DITA

The 2016 Technical Communication Benchmarking Survey

The Content Wrangler - Thu, 2016-05-26 18:21

Earlier this year, The Content Wrangler surveyed over 700 technical communication professionals from around the globe to learn as much as possible about the state of the industry. The results of the 2016 Technical Communication Benchmarking Survey are not scientific, but they do provide meaningful data points and help us spot trends. Our findings paint a picture of the current state of technical communication, especially as it relates to advanced information management practices, approaches, tools and planned innovations. They provide a snapshot of what the best-of-breed firms are doing today—and what they plan to do tomorrow. They also provide anecdotal evidence of emerging trends, as well as a way to benchmark our efforts against the best efforts of others.

What did we learn from the 2016 Technical Communication Benchmarking Survey?

A lot. In the four years since our last survey, significant changes have taken place. New content types—like video documentation—are being produced more often by more companies. Adoption of advanced information development management technologies, such as component content management systems (CCMS), XML authoring tools, and machine translation, is a planned innovation for firms hoping to lower costs and connect content to customers.

Request a free copy of the 2016 Technical Communication Benchmarking Survey summary today!

The web is the dominant delivery channel for product content

Up from 59% in 2012, 91% of firms surveyed publish their content to the web, making it the most common delivery channel for product content. While nearly every kind of product content is being pushed to the web, the mobile web is still a challenge for many technical communication departments. Only 24% of respondents publish product content to mobile-ready formats and/or mobile device apps.

Print is still alive and well, despite what some may think

Despite what some may believe, print is not dead. While print may seem obsolete in many ways, today it’s still the second most common delivery channel for product content; 49% of companies surveyed craft print deliverables. By comparison, only 11% create content on CD-ROM and 12% on DVD, a 50% drop in the four years since we ran our last benchmarking survey.

A few of the biggest challenges facing technical communication departments

Keeping content in-sync can be challenging in the best of situations. But, when content is prepared for multiple audiences, in multiple languages, to be delivered across multiple channels, things can get tricky.

The 2016 Technical Communication Benchmarking Survey found that the primary obstacles preventing technical content development teams from ensuring content consistency across channels are:

  • lack of a unified content strategy (42%)
  • lack of software tools designed to do the job (41%)

Departmental silos were mentioned as a major obstacle by 34% of the technical communication teams surveyed. Others blamed content consistency challenges on:

  • a lack of governance (39%)
  • an absence of collaboration (38%)

Anecdotally, there appears to be a lack of awareness of what’s possible. Of those surveyed, 23% complained that advances in technical communication content development are invisible to others across the enterprise, indicating a need to share our success stories, metrics, best practices, and approaches with others.

But wait, there’s more!

The 2016 Technical Communication Benchmarking Survey summary includes 8 pages of data covering topics such as:

  • Video documentation
  • Content strategy
  • Content reuse
  • Markup languages
  • Content quality
  • Terminology management
  • Multilingual content
  • Translation memory
  • Machine translation
  • Agile development
  • Innovations planned for the future
  • The most commonly used software products

A few details about the audience

The majority of survey respondents work for firms in the computer software and hardware sector (50%), followed by the financial services sector (7%), manufacturing (6%), life sciences and healthcare (6%), business services (4%), enterprise telecommunications (4%), universities and education (3%), defense and government (3%), mobile communication (2%), publishing and media (1%) and others.

Request a free copy of the 2016 Technical Communication Benchmarking Survey summary today!

The post The 2016 Technical Communication Benchmarking Survey appeared first on The Content Wrangler.

Categories: DITA

JOIN RSUITE AT SSP 2016, TABLE 7

Really Strategies - Wed, 2016-05-25 16:56

At SSP 2016, Table 7

 RSuite at SSP | 38th Annual Meeting | June 1-3


Schedule a demo at SSP and discover how RSuite can help you publish 50% faster than today

  • MS Word-based Authoring and Editing
  • Easy-to-use XML Editorial Tools
  • Semantic Enrichment
  • Automated Output to ePub, PDF, and more
  • Rules-based Packaging and Distribution
Categories: DITA

About Home Automation Devices

The Content Wrangler - Wed, 2016-05-25 10:00

By Tim Steele, special to The Content Wrangler

As previously discussed, home automation is a big deal these days. There are more vendors, systems, and home automation devices than ever before. Thanks to the internet-of-things standards that have been put in place over the last ten years, products from different companies can communicate with, and control, products from other vendors.

In my last article, I discussed different home automation systems. Now, let’s talk about devices—because without these individual building blocks, there is no system. The variety of smart devices available on the market grows exponentially, it seems, and there isn’t space in a single blog post to cover them all. So, we’ll discuss the most popular ones (and maybe just a few others).

Home Automation Devices

  • Light bulbs—There are now light bulbs with WiFi connectivity built in. Connected light bulbs allow you to turn lights on/off remotely, without having to mess around with expensive old-school approaches like rewiring your home. To get started, remove existing standard light bulbs and replace them with smart bulbs. With the help of a smartphone or smart watch app (each vendor produces its own), you can turn on a group of lights, all set at different brightness levels, and (with some bulbs) even program them to display different colors. These “lighting scenes” can be set to turn on or off at specific times or can be triggered by such things as sundown or sunrise or your proximity to your front door. You’ll need a hub—a small device attached to your internet connection so the phone can send messages through the internet to the bulbs. And, if integrated into a system like SmartThings, the bulbs can be controlled by other devices (motion sensors, presence sensors, etc.).

  • Light switches–I have a chandelier with tiny bulbs above my staircase. No one makes smart light bulbs of that size yet (it’s just a matter of time). No problem though. I replaced the light switch with a smart light switch. Now, the chandelier comes on automatically at sunset each day. It turns off when I settle down to watch television. It comes on automatically when anyone approaches the staircase in the middle of the night. I’ve replaced traditional light switches with smart light switches throughout my entire house.
  • Power outlets–Do you ever wonder if you’ve left the coffeemaker, iron, or curling iron on as you’re driving to work? My smart power outlets in the kitchen and bathrooms prevent me from worrying unnecessarily as they turn off automatically when I leave the house.
  • Motion sensors–These are really key in my house (I suspect it’s the same in most smart houses). With the help of motion sensors, the light bulbs, switches, and outlets mentioned above can all be turned off when no one is around to use them. Pleasantly dim lights come on automatically as I walk through the house to let the dog outside in the middle of the night. I get notified if there’s movement where I’m not expecting it while I am away–and connected cameras turn on to capture video or photos of whatever is moving, so I can see what is happening. Motion sensors also help control the temperature in the house by triggering the adjustment of the thermostat in the room where I’m watching TV (instead of simply knowing what the temperature is at the thermostat in the hallway upstairs).
  • Moisture sensors–If your washing machine breaks and water is pouring all over the place, you’ll be glad you have smart moisture sensors. I’ve set mine up to text me and to turn all the lights on in the house–and to change the color of the lights to blue (get it? Blue to represent water.). I have these at every bathtub, toilet, sink, water heater, washing machine, and dishwasher in my house. You can also get humidity sensors which will automatically turn on the fan while you’re showering (no more wet mirrors when you finish).
  • Presence sensors–These little devices can go on your keychain or in your car. Some systems (like SmartThings) allow you to use your phone as your presence sensor. That way, when I leave, the doors lock themselves, the garage door closes, and I’m notified if I’ve left any windows open (I can’t close them automatically, but I can decide whether or not to return home to close them). You can even attach one to your pet’s collar to get notified if they wander away from the house.

There are so many more smart devices that you can put in your house–everything from smart watering systems that water your grass only when it’s not already raining to alarms which replace the need for that monthly bill from your home alarm company. And there’s always more coming down the smart home pipeline!

The post About Home Automation Devices appeared first on The Content Wrangler.

Categories: DITA

What is a Capabilities System?

The Content Wrangler - Tue, 2016-05-24 20:11

Capabilities are differentiators that improve a company’s ability to compete in the marketplace. To work efficiently, capabilities need to be part of a system that is stronger than the sum of its parts and almost impossible for competitors to copy.

Watch this video from Strategy& to learn what a capabilities system is by looking at an example of a company that has one in place: Frito-Lay. You’ll discover how Frito-Lay combines its three differentiating capabilities of direct-store delivery, continuous innovation of new products, and consumer marketing into a powerful system that is at the heart of the company’s success.

The post What is a Capabilities System? appeared first on The Content Wrangler.

Categories: DITA

Need Prospects? Create Bigger, Bolder, Braver Content

The Content Wrangler - Sat, 2016-05-21 11:45

“The biggest missed opportunity in content is playing it too safe.”

And so goes the opening volley of Ann Handley’s outstanding webinar, Quality vs. Quantity: A Fight for Sore Eyes. Handley, a world-renowned content expert, is passionate about her craft. She believes that content professionals should focus on producing content that is direct, gutsy, and honest—a leaner, meaner variety that clearly stands apart from the safer “canned” versions. She mentions how quality alone renders the polarization of “quality vs. quantity” a moot point or “false choice,” and how only bold and edgy content can differentiate itself from the sea of noise that engulfs our content space.

Handley starts off by citing a few statistics from a recent B2B study she conducted with Joe Pulizzi, founder of the Content Marketing Institute. Among the organizations surveyed, a large majority stated that they planned to significantly enhance their content efforts in 2016: 76% plan to produce more content, while 51% plan to spend more on content.

Producing scalable quality content is always a good idea. The problem, however, is that only 30% of the B2B organizations were confident that their content was effective. In fact, for the last five to six years, the top challenge cited by these B2B orgs has always been to create engaging quality content. Producing and spending more on content whose effectiveness is uncertain is simply a “cart before the horse” blunder. Considering the time, effort and capital wasted on such a project, the negative effects can be quite significant.

Contrary to the organizational impulse to out-produce, out-trend, or out-spend rivals in the content space—that is, the production of even more “noise”—it seems to make better sense to figure out how to create engaging quality content to begin with.

Ann Handley’s advice:

Go BIGGER, BOLDER, BRAVER

Why Bigger, Bolder, Braver?

  • BIGGER stories put your products and company in the larger context of what people care about.
  • BOLDER marketing is about tackling relevant issues head-on and in a way that most other companies might be reluctant to try. Or telling a different story from a unique angle or point-of-view that reflects who you are.
  • BRAVER tone means differentiating yourself (your content) by stating who you are and what you do in a strong tone of voice that expresses the rawness of your corporate culture or personality.

Handley provides a number of great examples demonstrating what she means by going bigger, bolder, and braver. I suggest you watch the webinar to listen to Handley explain each point. For the remainder of this post, instead of touching upon the main points, we will trace a few of the unifying concepts that implicitly drive the main ideas.

Although Handley’s solutions are spot-on and pragmatic, their implications are a bit tricky and incongruous: they go against the grain of common assumptions that most companies hold regarding their role in the marketing process. Let’s take a closer look.

A BIGGER story displaces the role of the company/marketer

Let’s re-think marketing—namely, the force that drives it. However you choose to define the practice of marketing, it is essentially the process of producing or funneling desirability toward a product. There’s just one problem: desirability is in the domain of each individual customer, not the marketer. The most a marketer can do is to deliver content that resonates with the type of customer who just might desire the product.

The quality of your product or content does not drive desirability.

Kathy Sierra, author of BADASS: Making Users Awesome, is right in stating that customers who desire a product badly enough will tolerate any imperfections or inconveniences that are built into its design. In other words, a customer who needs your high-quality product may or may not find it desirable enough to purchase. If this is true, how do you go about marketing a product?

Handley makes a great point in stating that companies need to place their products in the larger context of what their customers care about. It’s about understanding what kinds of powers you want to give your customers. It’s also about realizing that the idea of “great marketing” or a “great product” is merely a secondary effect to what’s really happening. You don’t market the product. That’s the wrong POV. Instead, you market what that customer can “become” as a result of your product.

Harkening back to Kathy Sierra, when a customer says “this product is awesome,” they are really talking about themselves and what they have become thanks to the product. That is the BIGGER story; it’s not just “about” the customer, it actually belongs to the customer. And it’s important for every company to know its place in a customer’s narrative.

Using the 3 B’s to convert customers into your squad

“You can use your bigger, bolder, braver content to convert more people into your squad, to align them with your company on a level that’s bigger than what you sell or what you do.”

Brand tribalism has been a popular theme in the marketing space for some time. A concept that grew out of the work of a number of academics in the late 1990s, it was most recently popularized by marketer and bestselling author Seth Godin.

Handley doesn’t use the term “tribe,” but she talks about converting customers into “squads,” finding ways to “lead” them, and situating them within the context of a product culture. This is essentially brand tribalism. However, she talks about this process in relation to an open and fluid sense of corporate self-perception.

First of all, inspiring a brand/product tribe movement is a great idea. But from what position would a company lead? The tribe model is a point of view or framework used for operational convenience. A customer on his or her deathbed will not reminisce about the brand tribes he or she belonged to, because the customer probably doesn’t see it that way.

Take, for instance, CrossFit. It’s more than just a fitness service with a strong brand. It’s a culture; a way of life for some. And people take pride in belonging to it. But their pride as CrossFitters is not necessarily invested in the financial and operational well-being of the company. They are invested in themselves first and foremost. Should a competing company offer a better service, the customer might just as easily join that “tribe” as well.

Customers lead themselves. Preferred products just happen to be a part of their arsenal. So what is a company’s leadership role within its own internally-perceived tribe? Companies lead by making products or producing content that gives power to customers who are their own leaders.

A BOLDER approach and a BRAVER, gutsier tone—displacing the common notions of “professional” presentation

Having the gumption to affirm who you are, why you do what you do, and what you are like to deal with, is far better than using canned speech. As Handley points out, such an approach will attract the like-minded and repel the timid. This approach brings to mind a maxim written by the ancient Chinese military strategist, Sun Tzu, in which he states that campaigns are won long before they are fought.

By disengaging from the more common practices of corporate self-presentation, you not only differentiate yourself, you assume the risks of “authenticity.” You win over people who essentially have already been “won,” and you repel those who matter very little to you.

By consolidating your “true” customer base, you can intensify your content efforts with laser-like focus, powering them from a source that is truly your own and delivering them to an audience that truly wants to be there.

The post Need Prospects? Create Bigger, Bolder, Braver Content appeared first on The Content Wrangler.

Categories: DITA

What is a Differentiating Capability?

The Content Wrangler - Fri, 2016-05-20 15:05

Differentiating capabilities. They’re what management buys. But all too often, we get caught up in selling features. Watch this video from Strategy& and discover what a differentiating capability is and why it matters.

The post What is a Differentiating Capability? appeared first on The Content Wrangler.

Categories: DITA

Building A Robust Content Quality System

The Content Wrangler - Fri, 2016-05-20 14:27

Editor’s Note: The Content Wrangler is presenting a weekly series of twelve articles that provide useful insights and practical guidance for those who produce customer support websites. Columnist Robert Norris shares how to overcome operational challenges related to harvesting, publishing and maintaining online knowledge bases. His tenth installment examines the framework for a consolidated quality control program based on explicit content ownership.

By Robert Norris, special to The Content Wrangler

In a previous article, we examined how content wranglers can improve the quality of life for our colleagues in support roles and leverage insights (including customer support metrics) to our mutual benefit. This article offers insights into leveraging this collaboration to build a robust content quality system that will deliver significant improvements to content timeliness and usefulness.

Building A Robust Content Quality System

Consider this scenario:

Thanks to our colleagues on the customer support team, we content wranglers have been alerted to several deficient resources in the customer self-help knowledge base. These include several out-of-date resources along with some troubleshooting instructions that seem to cause more problems than they solve. We’ve also received feedback from senior support reps that a number of well-regarded resources are not being found by a significant number of customers seeking the information they cover.

To an organization committed to helping customers, partners and staff find useful answers as quickly and painlessly as possible, such explicit feedback is golden. Thanks to the frustrating experience of a few users, we have been alerted to deficiencies that—if fixed—will benefit many users.

To get the ball rolling, the typical first step is to notify the person who owns the problem resource and find out how long it will take to correct it. When the resource is prominent—and has a diligent owner—we can expect that an expert will be assigned to examine and repair it in short order. And that seems great; even though our quality system may not be as formal as that for our products and services, it still worked for this very important resource, right?

But, what about the poor orphans?? What’s that you say? While that one important resource from our list was promptly repaired, the others are in limbo. It seems no one is stepping up because:

  • One of the out-of-date resources was produced a couple of years ago, and the whole department has since been reorganized.
  • That FAQ in need of updating was written by a long-gone intern.
  • Those confusing instructions lack any information whatsoever about the author(s).

As content wranglers accustomed to dealing with orphaned content, we know from firsthand experience that it is unrealistic to rely upon the availability of original authors as the backbone of our quality system. Far too often we’ve been left wondering who is going to fix the problem…and how…and when.

 

Tick…tick…tick…

Faced with uncertainty and an uphill slog to find an available expert (a search that is often fruitless), the repair of orphaned content quietly—if unintentionally—slips to the back burner as higher priorities arise. Moreover, as the problem becomes chronic, awareness of a deficient resource gradually evaporates, leaving it lurking like a landmine in our knowledge bases until someone trips over it, triggering a serious problem:

In 2015, a prominent international relief organization settled a $10M+ liability lawsuit brought on behalf of the family of a volunteer who died while mishandling a power tool. Though the death was tragic, the individual was acting in direct violation of the organization’s safety policy, which prohibited volunteers from using that particular tool. The volunteer disregarded the training received that morning, training he had acknowledged by signing the liability release form. Even so, the organization offered a generous settlement, which was gratefully accepted by the family until a lawyer spotted an administrative error. Because the volunteer coordinator had downloaded an outdated form that pre-dated the existence of the power tool, the settlement was rejected and the organization had no choice but to shell out more than twenty times the original amount.

Whether it be goods, services or publishing, our risk managers will tell anyone who will listen:

A quality system that has gaps is a dangerous source of false security

If we frankly assess our somewhat-less-than-formal content quality system, we may conclude that not only is it inadequate, it’s a self-inflicted wound waiting to happen. And even if the repercussions are not devastating, avoidable quality-related problems incur unnecessary costs we cannot afford. Given our enormous investment in knowledge sharing labor and technology, and the fact that the shelf life of knowledge base content varies by myriad factors—some of which are completely unpredictable—we must have a mechanism that lets us respond quickly when alerted that a resource needs to be revised or retired.

Conduct a self-examination

Fortunately, there is a straightforward approach to determine if our organization has an adequate quality system in place. We identify a handful of deficient resources (representing a few topics) and track the workflow activities and timing by which the problems are addressed. Though our organization’s approach may be operationally unique, the stages needed to quickly and efficiently diagnose and correct defects will mirror the following:

  1. Flag Problem–a permissioned keyboarder is prompted to log the resource into the deficient content tracking system
  2. Notification–the owner of the resource is notified and responsibility is assigned for review
  3. Review–an expert reviews the resource, recommends corrective action, e.g. revision, retirement, and examines related content to determine if there is an impact; in the interim, the resource is taken offline and/or a comment added to alert users that a fix is underway
  4. Production–the correction workflow—authoring, editing, enhancing, proofing—is scheduled and managed
  5. Publishing–as the revised resource is structured and published, metadata and navigation are adjusted and appropriate material archived, e.g. source material

Given that we are in the midst of consolidating content publishing for multiple websites, it’s likely that one or more stages in our content quality system need attention. Symptoms of underlying problems might include:

  • Unclear Ownership–It should be trivial to identify the topic owner for any resource, be it a tutorial, operations manual, form, graphic or FAQ. Orphaned content has no place on our websites.

    Tactic: As described in the content strategy article, it makes sense to assign topical content ownership at the upper-management level to establish accountability with a role that has authority. Since every resource we publish incurs a burden of maintenance, this principle places that burden on the shoulders of someone with the resources needed to prioritize and execute the task.
  • Black Holes–Given that our quality system relies upon upper-management, we should expect that there are going to be times when requests for action go unanswered. In particular, absent a tracking and reporting system that brings much-needed visibility to resources awaiting corrective action, it is likely that the needed work will not be conducted promptly (if at all).

    Tactic: A decent content management system will include functionality to track and report on resources that require attention. Lacking a CMS, one can make do by maintaining and sharing a spreadsheet-based master list that becomes the source for routine status reports to content owners.

    Important Note: When a content owner is chronically slow to repair resources, the content strategy has a mechanism to build a sense of urgency without creating conflict. The guiding principle of periodic reporting ensures that the matter will naturally be brought to the attention of the Operations Committee. If they are stymied (or choose to defer), the matter is escalated to the Sponsors’ Committee. This approach shields content wranglers by deferring to the peers of the content owner to handle the matter.
  • Missed Opportunities–Fundamental to a robust quality system is the principle that discovering a deficiency is an opportunity to seek causal factor(s). For example, finding an outdated version of a resource still online should alert us to the possibility that our process for publishing revisions does not yet require the step to seek and purge items that suddenly become obsolete.

    Tactic: Make it clear to all that we value critique by following up. When someone has taken their valuable time to alert us to a deficiency, we are wise to gratefully acknowledge the contribution. It can be as simple as offering our thanks and seeking a bit of nuanced information, e.g. search terms, navigation path. As a matter of practice, we should presume—until proven otherwise—that we’ve found a symptom of an underlying problem and dig into it.

    An especially effective approach is to host a lunch with a small group of colleagues who have a shared experience using a knowledge base to help complete an unfamiliar task, e.g. recently filing expense reports for an international trip. We should also encourage our colleagues in support to engage their users (when appropriate) in the same fashion and reward both the customer and the support rep when they deliver useful feedback.

Recap

By making our content quality system airtight, we are poised to reap the benefits of our strategic commitment to consolidate the publishing operations for multiple websites. With clarity over our content shortcomings, we will make data-driven decisions to prioritize and resource the investments that optimize ROI. Strategically, we have the support of policy and stakeholders to streamline content production. To do that, we will seek the wisdom of our talented colleagues in marketing.

Last Week: Robert’s ninth of twelve articles, A Swing and a Miss: Faulty Customer Support Metrics, examines how content wranglers can improve the quality of life for our colleagues in support roles and leverage insights (including customer support metrics) to our mutual benefit.

Next Week: Robert’s eleventh of twelve articles, “Learning From the Masters,” examines how we can engage our colleagues in marketing to help us vastly improve the content we deliver to customers, partners and staff.

The post Building A Robust Content Quality System appeared first on The Content Wrangler.

Categories: DITA

Delivering HTML from DITA in The Face of Reuse

Dr. Macro's XML Rants - Wed, 2016-05-18 15:50

Delivering HTML from DITA in The Face of Reuse of Topics in a Single Publication

In DITA you can use the same topic multiple times from the same map. For example, the same user interface component might be used by several different parts of a program, and you want to include that topic in the descriptions for each of those parts. Or you might have a common installation topic that uses conditional content to apply to different operating systems.

In monolithic output formats like PDF and EPUB, this reuse does not present any particular practical problems: because the rendered publication is a single linear flow of content, each use simply occurs in the flow and reflects whatever conditions are in effect at that point in the publication. However, with multi-file output formats like HTML, there are several practical problems.

The most obvious problem is the "one result file or multiple result files?" question: when a topic is used multiple times, do you want just one result HTML file, or one result HTML file for each use? The DITA Open Toolkit, through version 1.8.5, only generates a single result HTML file unless the map author specifies @copy-to on the topicrefs to the topic. (The @copy-to attribute specifies the effective source path and filename for the referenced topic, so that the processor treats that use of the topic as though it were a copy of the real topic with the specified filename.)

The "every page is page one" philosophy says you should have just one result HTML file for a given topic. Likewise, searching is usually more effective if there is just one HTML file; otherwise you end up getting multiple search results for the same content, which confuses users and makes it hard to know which version to use (and may throw off search ranking algorithms that take the number of copies of a file into account in some way). On the other hand, if a user comes to a topic that is used in multiple places in the publication, how do they know which use they care about in their current access session?

For the reused installation topic example, if it reflects multiple operating systems and there is only one copy, you would appear to be required to show all the operating system versions and use flagging to distinguish them. On the other hand, if you have one HTML file for each copy of the topic, each HTML file only reflecting a single operating system, a search on installation will find all the copies, making it hard for the user to choose the right one.

DITA 1.3 adds an important new feature, key scopes, which allows keys to have different values in different parts of the same map. This lets you reuse the same topic in different contexts and have content references, hyperlinks, and key-defined text strings resolve to different values in the different use contexts. For the installation example, you could have three key scopes, one for each of the operating systems: Windows, OSX, and Linux.

DITA 1.3 also adds the new branch filtering feature. With branch filtering you can apply different filtering conditions to branches within a single map. This lets you use the same topic in different parts of the map with different filtering conditions applied. For the installation topic, you can now have a single topic as authored, with content conditional on each operating system, and have only the matching operating system's content filtered in for each branch.
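
To make these two features concrete, here is a minimal sketch of a map that uses both key scopes and branch filtering for the three-operating-system case. The file names, key names, and condition values are hypothetical, invented for this illustration:

    <map>
      <title>Installation Guide</title>
      <!-- Each branch reuses installing.dita, but with its own key scope
           and its own DITAVAL conditions applied via <ditavalref>. -->
      <topichead navtitle="Installing on Windows" keyscope="windows">
        <ditavalref href="windows.ditaval"/>
        <keydef keys="download-page" href="https://example.com/win"
                scope="external" format="html"/>
        <topicref href="installing.dita"/>
      </topichead>
      <topichead navtitle="Installing on OSX" keyscope="osx">
        <ditavalref href="osx.ditaval"/>
        <keydef keys="download-page" href="https://example.com/osx"
                scope="external" format="html"/>
        <topicref href="installing.dita"/>
      </topichead>
      <topichead navtitle="Installing on Linux" keyscope="linux">
        <ditavalref href="linux.ditaval"/>
        <keydef keys="download-page" href="https://example.com/linux"
                scope="external" format="html"/>
        <topicref href="installing.dita"/>
      </topichead>
    </map>

Each DITAVAL file then includes its own platform and excludes the others; for example, windows.ditaval might look like this:

    <val>
      <prop att="platform" val="windows" action="include"/>
      <prop att="platform" val="osx" action="exclude"/>
      <prop att="platform" val="linux" action="exclude"/>
    </val>

Inside installing.dita, a reference like <xref keyref="download-page"/> resolves to a different target in each branch, and the platform-conditional content survives filtering only in the matching branch.
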
It should be obvious that this must result in either three result HTML files, one reflecting each different set of filtering conditions, or a single HTML file constructed so that the browser can do the filtering dynamically, such as through different CSS files for the different filtering conditions, through Javascript, or some combination.

This all means that, with DITA 1.3, DITA-to-HTML processors must handle multiple uses of the same topic in a sophisticated way. The OT 1.x approach of generating a single HTML result will not work. Likewise, the OT 2.x approach of always generating a new result file works (in that it ensures a correct result) but does not necessarily satisfy requirements for minimizing content duplication in the result. So basically there is a fundamental conflict between ensuring correct content in the generated HTML when branch filtering and key scopes are in effect and satisfying the "every page is page one" philosophy.

If every use of a topic results in a new HTML file, then searching is impaired but HTML generation is as simple as it can be. In the context of the Open Toolkit, branch filtering (and @copy-to) is applied to create new intermediate topic files, and then those intermediate topics are filtered to produce another set of intermediate topics which are then the input to the normal HTML generation process. All the data processing complexity is in the preprocessing.

In order to produce a single result HTML file, the processor has to determine, for the conditional content in a given topic, which content would be filtered out of all uses and which content would be filtered in in any use context, and produce an intermediate topic that omits the globally excluded elements but retains the elements included in any use. It also has to somehow record each use and how it relates to the included conditional elements, so that the final HTML generation stage can retain that information in the generated HTML and CSS or Javascript can act on it. For example, the processor might translate each unique set of filtering conditions into a value included in the conditional element's @class values, or it might embed some JSON data structure that establishes the map context the element was referenced in.

Given this kind of information in the generated HTML, it would then be possible to have the browser dynamically show or hide specific elements based on the active conditions selected by the reader. By default the content could be flagged as it would be in the normal flagged output result produced by the normal Open Toolkit flagging preprocessing.
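
To illustrate the kind of output this implies, here is a hypothetical fragment of generated HTML plus a few lines of Javascript. The class-naming scheme is invented for this sketch; it is not what the Open Toolkit or any other processor actually emits:

    <p class="filtered platform-windows">Run the installer as Administrator.</p>
    <p class="filtered platform-osx">Drag the application to the Applications folder.</p>
    <p class="filtered platform-linux">Unpack the tarball and run install.sh.</p>

    <script>
    // Show only the elements whose condition class matches the platform
    // the reader selected; hide the other conditional variants.
    function applyCondition(platform) {
      var els = document.querySelectorAll('.filtered');
      for (var i = 0; i < els.length; i++) {
        els[i].style.display =
          els[i].classList.contains('platform-' + platform) ? '' : 'none';
      }
    }
    applyCondition('windows'); // e.g., driven by a picker or a URL parameter
    </script>

Until the reader picks a condition, the same markup could simply be displayed with the usual flagging applied.
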
However, with this dynamically filtered HTML file there is still the problem of the reader knowing what use context they want to view the topic in terms of. For example, if you do a search, find this installation HTML page, and open it, you then have to decide which operating system you want to view it in terms of. How is this decision presented to the reader? How does the Web site track access in order to establish this use context automatically when it can?

And of course the situation could be much more complicated: there could be a number of conditions against which the content is filtered, e.g., operating system, hardware platform, region, active product features, etc.

I think this is a delivery challenge that the DITA community needs to address generally: by establishing best practices around content authoring and delivery, by implementing the DITA-to-HTML processing that supports generating these more sophisticated HTML pages, and by implementing general CSS and Javascript libraries for use in DITA-based Web sites.
Categories: DITA

Why People Forget Our Content (and What We Can Do to Fix It)

The Content Wrangler - Wed, 2016-05-18 11:00

Why do people forget our content?

One of the most obvious reasons why people forget our content is information overload. In a survey of 124 managers from various professional fields in Australia, Hong Kong, the U.K., and the U.S., information overload was recognized as a top professional issue; participants confessed they find it impossible to manage information (62%), most content is irrelevant (53%), and they lack time to understand it (32%). In other words, we are drowning in data and barely have time to make sense of it, let alone remember it.

Also known as data smog, analysis paralysis, or information fatigue syndrome, information overload has been at the center of abundant research. Scientists have demonstrated through many experiments that the human brain goes through an initial stage of information processing, ultimately achieving an optimal input volume. Then when information overload occurs, the result is decreased cognitive performance. This means decreased attention span, lack of memory, and poor decision-making.

As I mention in my book Impossible to Ignore, we are left with two choices: decrease the amount of information we share with others, or help them better process information. The latter is more reasonable because it is difficult, if not impossible, to control people’s exposure to information. Reflecting on your own content, consider these techniques:

Determine the most important message that must be remembered long-term

This helps us filter a lot of unnecessary details and therefore decreases content bulk. A frequent mistake communicators make is sharing too many messages because they don’t have time to figure out which one is most important. The result is overload, and scientists agree that individuals’ reaction to information overload is omission (failing to assimilate information) and error (processing information incorrectly). Omission is selective, in the sense that sometimes people omit difficult information even though it may be highly relevant, or pay attention to something minor and therefore misinterpret a message.

If your message is complex, paint a big picture and address smaller components gradually. I often see presenters trying to explain everything all at once, which leads to the two consequences above: omission and error.

Avoid “spicing up” content with multimedia

In an effort to eliminate boredom, communicators often feel that adding multimedia, such as images and videos, will make content more exciting. This is not always the case. Research findings are converging to show us how much extra cognitive pressure the overabundance of multimedia places on people’s ability to process information and remember it. In one study, researchers asked more than one hundred volunteers to watch an online presentation about the country of Mali. Some participants watched a text-only version, while others viewed the text version plus an audiovisual component with additional information presented simultaneously. The text-only group scored significantly better on a quiz about the materials. The multimedia viewers were more likely to agree that they did not learn anything from the presentation. Use “flash” sparingly.

Extra “flash” is not always memorable. Image courtesy of Rexi Media.

Tie an important message to your audience’s current goal

People tend to pay more attention to what needs to be solved in the immediate future and to what is immediately relevant. The brain system responsible for focusing attention on what counts and ignoring everything else is called the Reticular Activating System (RAS). People are acutely interested in themselves, so when your content offers solutions to personal, nagging issues, their RAS lets you in, regardless of information overload. The old adage “know your audience” needs an update to “deeply know your audience.” The better you use their profile (likes, dislikes, immediate needs and goals), the longer you can keep them paying attention.

Have something interesting to say

The RAS does not distinguish between real events and “synthetic” reality. In other words, if a message is current, relevant, and interesting, the brain will focus. Using strong words that build mental pictures enables the brain to handle an increased load of information. For example, we may think that, in the age of 10-second commercials, long text-based ads are gone. Far from true.

Check out this ad from the Royal Parks Foundation, a charity that helps support London’s eight Royal Parks so everyone can enjoy them. Their headline provokes us:

Is your life more interesting than a squirrel’s?

We cannot help but read on:

You might think it is. But you take the same tube every day. You spend five days a week sitting at your desk, work through lunch, and stay late. You buy the same sandwich day in, day out with the same drink. You make excuses not to see friends on weekends because ‘you’re busy,’ when you’re really just watching TV. Sunday is spent recovering from Saturday and preparing for Monday. It doesn’t sound too exciting, does it?

Learn from the squirrel. His commute is a playful skip through beautiful gardens surrounding vast lakes. His office is some 5,000 acres of striking parkland. He spends time with his family and his only deadline is winter. He eats nuts but not because Nigella says so. His home has historic landscapes alongside beautiful fountains and he doesn’t pay a penny.

Now who’s nuts?

If we count the facts that the ad wanted to emphasize, there are just a few: gardens, lakes, 5,000 acres of parkland, historic landscapes, fountains, free entrance. Out of 139 words, only about 13 of them contain the essence. Yet, it is the other 100+ words that make the ad interesting. Do not sacrifice detail in the name of reducing overload. Elaboration can lead to memory when it activates multiple sensory areas. This is why good storytelling can be so powerful.

 

If you’d like to learn more about neuroscience and communication, consider attending our June 15, 2016 webinar: AGAIN: The Neuroscience of Repetition.

For more information on how the brain processes information, remembers, and decides to act, read Impossible to Ignore by Dr. Carmen Simon, available at Amazon or your favorite bookseller.

 

The post Why People Forget Our Content (and What We Can Do to Fix It) appeared first on The Content Wrangler.

Categories: DITA

Trying out Blazegraph

bobdc.blog - Tue, 2016-05-17 12:17
Especially inferencing. Bob DuCharme http://www.snee.com/bobdc.blog
Categories: DITA

The US Digital Service: Transforming the Way Government Works

The Content Wrangler - Tue, 2016-05-17 03:15

“The U.S. Digital Service is using the best of product design and engineering practices to transform the way government works for the American people…Together, teams of America’s most capable problem-solvers are striving to make critical services — like healthcare, student loans, and veterans’ benefits — as simple as buying a book online.”

In the span of just a few years, the Obama administration has launched a series of initiatives aimed at modernizing the federal government’s public-service technologies. The overall thrust and scope of these initiatives is nothing short of a major “reboot.” They aim not only to equip government with the most current technologies, but to introduce new ways of thinking and new practices that would retrofit governmental operations with the forward-leaning inclinations of a modern day tech start-up.

What’s at stake is the government’s credibility to deliver services in a manner that is adequate to today’s technological standards. As the problematic 2013 Healthcare.gov roll-out has shown, government services are only as good as the technologies used to deliver them.

Following the lessons learned from Healthcare.gov, the White House spearheaded a series of efforts to fast-track progress in two distinct, yet parallel, areas of focus: 1) the development of innovative technologies and practices, and 2) the modernization of outdated systems. Our focus will be on the modernization efforts, which are in the domain of a relatively new agency called the U.S. Digital Service (USDS).

Enter the US Digital Service

Launched in 2014 as part of the Executive Office of the President, the U.S. Digital Service (USDS) is currently tasked with providing consultation to federal agencies to improve and simplify their digital infrastructures. Led by its current Administrator, Mikey Dickerson, a former Google engineer who played a key role in salvaging Healthcare.gov, USDS is composed of engineers and product designers recruited from the private sector for time-limited “tours of duty.” With a staff numbering 40 in 2015, the agency is looking to expand to 500 people in 2016 to provide services across all 24 government agencies.

We will take a closer look at the USDS initiative to get a clearer picture of who they are, what they do, who they are recruiting, and what this all means (if anything) for content professionals. But first, to get a better understanding of the agency’s current role, let’s revisit the historical events that led to its creation.

Events leading up to the creation of USDS

  • 2012: The White House launched the highly-competitive Presidential Innovation Fellows (PIF) program which paired top technologists and innovators from the private sector with change-leaders from within the federal government. The PIF program was introduced by then U.S. Chief Technology Officer, Todd Park. With the goal of introducing the “innovation economy” into government, PIF worked closely with civil servants to tackle major technology-related problems in various government agencies.
  • 2013: The Obama administration launched its Healthcare.gov website, a project managed by the Centers for Medicare and Medicaid Services (CMS). What was supposed to have been one of the administration’s crowning achievements resulted in one of the greatest and most visible government IT debacles of all time. With the new healthcare initiative in crisis, an ad hoc “trauma team” of tech wizards including a number of PIF alums was hastily assembled and led by then U.S. CTO Todd Park and Mikey Dickerson, who was still working for Google at that point.
  • 2014: Following Healthcare.gov’s rapid recovery, two “start-ups” were created to accelerate the White House’s technology initiatives: 18F and the USDS. Whereas 18F’s primary focus is on developing digital products for government organizations and importing a more digitally oriented “culture” (as defined by its introduction of design thinking and lean-startup models, insistence on open-source code, and use of contemporary programming languages), USDS’ main efforts focus on modernizing public service tech infrastructure and minimizing the risk of future tech debacles.

Current USDS projects and publications

Looking at the current projects listed on the USDS website, you will notice a short list centered on the hotter topics of political concern: healthcare, immigration, veteran services, student loans, and small business. Although it is difficult to get a more detailed description of their operations, as most of their web content is focused on recruitment, their list of publications and projects provides some insight into their operational goals and principles.

Publications:

Their U.S. Digital Service Playbook provides 13 key “plays”—guiding principles and tactics—that define USDS’ general approach to building and enhancing digital services. Their TechFAR Handbook provides guidance for the contracting and procurement side of their playbook’s implementation.

Projects:

U.S. Department of Veterans Affairs (VA): USDS is currently assisting the VA in rebuilding its digital infrastructure to improve and simplify claims processing and information access.

U.S. Citizenship and Immigration Service (USCIS): Along with 18F, USDS is helping the immigration agency transition from a paper-based system to the digital cloud. This entails establishing key analytics to monitor applications and issues, and conducting more extensive user research in order to provide a better end-to-end user experience.

Healthcare.gov: USDS continues to provide support for the website’s day-to-day operations, “overhauling, updating, and simplifying” its design and infrastructure.

openFOIA: Another collaboration with 18F, this project aims to develop digital products to streamline Freedom of Information Act (FOIA) requests, and to provide better access to FEC data via open APIs.

Recruitment and expansion

As I mentioned earlier, most of the content on the US Digital Service website is heavily focused toward recruitment. For anyone interested in joining USDS, there are a number of questions one might have regarding the types of projects available, the kinds of technologists they are seeking, and the process for application.

The application page is easy to access, as links are prominent on the home page. Upon clicking one of the links, you will notice that the initial application process itself is fairly simple and user-friendly, requiring applicants to answer a simple set of questions and to submit a resume.

Although the website does not list specific types of projects, roles, or responsibilities, the recruitment page asks applicants to specify their field of expertise. A drop-down menu listing 12 fields is provided, leaving an additional space for applicants to state a field that is not included on the list.

The last field tells us a couple of things. First, it is in the nature of this type of enterprise to expect a conflation or cross-pollination of skill sets and ideas. Second, as a new agency, and one following a start-up model, USDS’ formalization of roles and responsibilities, and their corresponding fields of expertise, may still be in development and subject to the demands and results of impending projects. This leads us to one last question: what roles might be available for content professionals?

USDS and 18F content projects

Although the term “content strategy” is nowhere to be found on any of the USDS or 18F web pages, two projects illustrate efforts that are well within the realm of most content strategists. It’s uncertain whether content efforts are treated as distinct components within a larger and related organizing principle, or whether “content strategy” is viewed as a primary organizing principle conjoined with other processes.

Let’s take a look at one of their content projects: The U.S. Web Design Standards.

A collaborative effort involving both USDS and 18F, the U.S. Web Design Standards were developed to eliminate the cacophonous inefficiencies plaguing government agency websites. In cases where citizens require specific services from more than one agency, or services are provided as a joint effort between multiple agencies, consistency in content and interaction patterns is key to successfully providing those services. Prior to the standards project, this much-needed consistency did not exist.

Instead of experiencing a seamless and user-friendly process, most users found a lot of repetitive and inconsistent content spread across different agency sites pertaining to the same government programs. This created a tremendous amount of confusion and frustration on the part of the users.

Users who visited multiple agency sites came across a cacophony of different visual brands, formats, and interaction procedures. Not only did this give the impression that the agencies were a heterogeneous and uncoordinated set, but some users also began to question whether the sites they visited were legitimate, or whether they had somehow ended up on a fraudulent site.

To add to the frustration, many of the agency sites were not compatible with mobile phones or tablets. At a time when a large majority of people conduct their online activities on a mobile device, incompatible sites are simply outdated. This is the case for a large number of government sites.

All of these issues combined resulted not only in bad experiences for users, but also in missed opportunities or, in some cases, the complete inability to find and receive badly needed services.

To bring order to all of this chaos, the US Digital Service and 18F developed the U.S. Web Design Standards. The standards include a visual style guide and a formalized set of common UI components and patterns. Code and designs are customizable and adaptable for reuse and repurposing. Digital tools were also created to meet Section 508 accessibility standards.
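
As a rough sketch of what adopting the standards looks like in a page, the markup below reuses the shared component classes instead of inventing new ones. The stylesheet path is hypothetical, and the snippet assumes the standards' usa- class-name convention:

    <link rel="stylesheet" href="/assets/css/uswds.min.css">

    <button class="usa-button">Apply for benefits</button>
    <a class="usa-button" href="https://www.usa.gov">Find more services</a>

Because every agency site draws on the same tested components, a button looks and behaves the same wherever a citizen encounters it, which is exactly the consistency the standards set out to provide.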

Another content project, offered exclusively through 18F, is their new RFP ghostwriting service. Recognizing that the success or failure of federal IT projects often stems from the manner in which Requests for Proposals (RFPs) are constructed, 18F offers consultation services to review or ghostwrite agency RFPs. Their goal is to ensure that agency RFPs contain well-defined business objectives and clearly articulated technical direction to avoid miscommunication or misunderstanding on the part of prospective bidders.

Based on the angle and direction of approach, USDS’ and 18F’s work seems highly compatible with intelligent content principles and practices. Whether or not content strategy concepts and vernacular are recognizable within their environment, both worlds seem to operate along parallel lines, employing parallel means to address the same issues. In other words, USDS might provide a wonderful opportunity for any content strategist interested in helping to improve the federal government’s public service technologies.

Learn more about USDS.

The post The US Digital Service: Transforming the Way Government Works appeared first on The Content Wrangler.

Categories: DITA

Making Effective Communication a Priority in Humanitarian Relief Efforts

The Content Wrangler - Sun, 2016-05-15 03:05

A follow-up to “Content and Crisis: Translators Without Borders,” this interview introduces Aimee Ansari, Translators Without Borders’ newly appointed executive director.

Merging Perspectives

Translators Without Borders’ (TWB) capacity to deliver timely and professional-grade content translations across multiple language groups makes it a significant force in the deployment of humanitarian services. But the success of its operation rests on the quality and depth of its collaborative integration with other functionally diverse humanitarian agencies. In order to provide assistance during times of crisis, TWB must establish strong working partnerships with various organizations, all of which have different specializations, geographical interests, and organizational structures.

This matter alone can prove challenging. And hands-on experience working directly at the center of this type of environment matters, as it helps organizations understand the real-world challenges that such collaborations and regions entail.

This type of perspective is what Aimee Ansari brings to the table. With over 20 years of field and administrative experience providing humanitarian aid in crisis-torn regions, some of which were quite harrowing, Ansari brings invaluable perspective and approaches to TWB’s current and future projects.

 Children in the aftermath of bombing

TCW: You’ve been involved in humanitarian relief efforts with numerous organizations for the last 20 years. Tell us a little bit about how you got started.

Ansari: My start in humanitarian work was not planned. I studied the Soviet Union. When it collapsed, those of us who spoke Russian had to go [to the former Soviet republics] if we wanted to advance our careers. Because I had studied the non-Russian republics, I applied for a job in Kyrgyzstan with the UN. I was lucky and got it. It turned out that part of the job was to work with Tajik refugees fleeing the civil war. After that, it was post-conflict Yemen (which has sadly, terribly re-ignited), disaster risk reduction in Bangladesh, and on and on.

TCW: What compelled you to remain in this line of work in light of the extreme risks that some of your environments presented?

Ansari: You had just driven in on LA freeways when we spoke.  To me, driving in LA seems much more frightening than most of the places I’ve been—you have road rage, drive-by shootings, I mean, people pull guns out and shoot other drivers on the freeways in LA.  That’s nuts: I’d be terrified.

In reality, I know it’s not like that—just as living in the places you hear about isn’t exactly like what you see on the news. They are terrible situations, but you learn how to deal with the insecurity.

What’s more difficult is the trauma that comes with talking to people who themselves are traumatized or dying. Watching a child die from malnutrition when there’s nothing you can do, or listening to a woman talk about being gang-raped…that’s really hard. That’s what makes aid workers stop going back—because, personally, it’s difficult to hear that, see it, talk about it and only be able to do so much. Aid workers can provide food, water, shelter, and some degree of protection. But we can’t stop the killing. And that’s hard to deal with.

 Aid worker delivering food

But, you’re right, it is much harder to sit in my house and do nothing than to use my knowledge, skills and experience to at least do something. It does take training and understanding, though. We’ve learned the hard way that giving assistance can sometimes put people more at risk. It takes experience and understanding of the context—it takes professionals—to make sure that we preserve people’s dignity, help save lives, and at the very least, do no harm.

TCW: Tell us about your experience in taking on a leadership role in a crisis hot-zone. What did you find were the biggest challenges and responsibilities?

Ansari: There are a lot of challenges in leadership roles. They are very lonely, first of all. In many places, because of the risks, you can’t take your husband, or wife, or kids. They stay behind and never really understand what’s going on in your life every day.

In really insecure contexts, though, you have to help your team manage these conditions, too. They are also alone and without their families; they are also seeing and experiencing trauma. And, as the leader, you have to always be in a position to support them. For me, this meant making sure that I was healthy and as mentally strong as I could be and, as I became more experienced, learning to recognize signs of trauma in myself and knowing how to deal with it. That’s probably the most difficult—being able to say that you can’t handle it any more—and having the confidence to do that as a sign of strength and leadership.

TCW: What contributed to your decision to take on the executive director role at Translators Without Borders?

Ansari: TWB is a wonderful organization. And what we do is so needed in the humanitarian community. One of the biggest problems humanitarians have is communicating with people affected by crises—we speak the wrong languages, we don’t understand their cultures, etc. We often use our admin teams or drivers to help us translate simply because we don’t know where and how to get translators. When you combine this with the use of technology to make content simpler and more accessible—it’s exactly what humanitarians need.

When I understood what the organization does, I couldn’t understand why we aren’t much bigger, helping more people. It’s very exciting.

 Charity workers making strategy

TCW: Tell us about the future projects, direction, and goals you have planned for TWB.

Ansari: As I mentioned above, the organization has so much potential to do so much more. I would like to see us exponentially expand the number of languages we cover—focusing on those spoken where disasters most often occur. I would like our library or catalogue of simplified documents (the Words of Relief Digital Exchange) to include a wide range of information that every aid worker needs to communicate effectively—video, audio, recordings of simple messaging in hundreds of languages, all openly accessible to humanitarian organizations when they need them.

I want every humanitarian and development organization to recognize the criticality of effective communications and make it a priority in everything they do.

An example: The Ministry of Health in a francophone country would like a partner of theirs to conduct some training of community health workers based on materials that are currently in English. The partner has asked us to translate the materials into French. However, we know that only 20% of the people in the country speak French. So we have suggested to the partner (and the Ministry of Health) that we train translators to work in the two major local languages that they speak, and they can translate the health materials into those local languages. Once we do that, we can build a number of other tools that will help support on-going translation needs in those languages, making much more information available to the people in that country.

TCW: Content exists in multiple forms serving various purposes, and can be delivered through a wide range of formats and channels. Based on your past experience, what modes of content production and delivery were most effective in regions in which communications access and capabilities were tremendously hampered or lacking?

 Ruins of Kobani, Syria

Ansari: Right now, in order to help those fleeing the Middle East for Europe, it’s all about technology—websites, blogs, information in the languages and places that a highly educated, tech-savvy group can access. But, in South Sudan, even in the capital, there were only a few hours of electricity per day for most people. Outside the capital, few places had regular electricity. If we wanted to communicate with them effectively, then using complicated English and Arabic (the two “official” languages) via radio programs was going to have limited effect. We had to learn to communicate both in the local language (especially if we were trying to effect behavioral change) and in the ways those cultures traditionally learned. Working with anthropologists was crucial. For example, in some countries, people communicate “lessons” via fables. In West Africa, songs are used to communicate information or to express opinions. In Bangladesh, dramas or plays were used by villagers to challenge elites. These are the lessons we have to draw on as we work with humanitarians to adapt their communications if we want them to be effective.

TCW: Aside from professional translation services, what else can content strategists and content creators do to help TWB?

Ansari: Donating time and money is always welcome, of course. Running a fund-raising or awareness event is very much appreciated, and we need that help. We are currently thinking through some new strategies around content creation. We also invite content strategists to join us in our effort to raise awareness of the importance of content in the right language in order to maximize relevance and effectiveness. So watch this space—there will be more coming soon!

TCW: Aimee, we thank you so much for sharing your thoughts. We wish you the best and look forward to helping you spread the word about the good work done by the staff and volunteers of TWB. Thank you for all you do.

 Donate or volunteer today.

The post Making Effective Communication a Priority in Humanitarian Relief Efforts appeared first on The Content Wrangler.

Categories: DITA

Wayback Machine: How The Internet Archive Continues To Inform, Serve, and Inspire

The Content Wrangler - Fri, 2016-05-13 03:46

When it comes to finding a printed or recorded resource for educational, pleasure or business purposes, nothing beats a trip to the good old neighborhood library—that vast, utilitarian warehouse of lendable published works just waiting to be discovered.

But in the 21st Century, when time-strapped citizens have come to rely on instant access to information on an increasing array of high-tech devices, that quest for knowledge and entertainment doesn’t have to involve getting in the car, searching through the stacks, and waiting in the checkout line. Fortunately, there’s a digital library you can visit, loaded with priceless materials that can save you a trip to your favorite brick-and-mortar media depot and possibly save you money otherwise spent at Amazon, Netflix and iTunes, too.

Founded in 1996 by Brewster Kahle, the San Francisco-based nonprofit Internet Archive began life with a bold goal: to offer the masses gratis access to a wealth of digital materials and collections and serve as the online athenaeum of choice to the world, much as Wikipedia functions as the definitive complimentary electronic encyclopedia. 

 Shipping containers at the Internet Archive headquarters.

Shipping containers inside Internet Archive’s Physical Archive building. The containers are used for high density storage of physical media after it has been digitized. The Physical Archive contains more than a million books, plus tens of thousands of reels of film, LP records, VHS tapes, and other types of physical media.

The Internet Archive has something for everyone

Today, visitors from across the globe retrieve countless digitized books, films, TV clips, websites, software, music and audio files, photos, games, maps, court and legal documents, and more via the no-charge Internet Archive.

Among its impressive accomplishments over the past 20 years, consider that the Internet Archive:

  • has amassed approximately 25 petabytes of data, which includes 470 billion web captures, 8 million ebooks and texts, 2.5 million audio items (such as 150,000 live concerts), 2.2 million movies and videos, 1 million images, 1 million TV news broadcasts, and 100,000 software items
  • digitizes approximately 1,000 physical books per day in 30 centers on 5 continents
  • archives roughly 1 billion web captures each week
  • enjoys 2 to 3 million visitors per day

“The Internet Archive started as more of a warehouse for digital materials that researchers could draw on, but we have evolved to have a web presence as well,” says Kahle, whose objective is to collect a copy of every book ever published with the help of Internet Archive’s Physical Archive, established in 2011. “We are now in the top 300 websites, according to Alexa Internet.”

A robust repository from the inside out

Alexis Rossi, Internet Archive’s director of Media and Access, says storing 25 petabytes worth of digital goods on their own servers can be downright difficult.

“It’s actually more than 50 petabytes, since all our media is stored at least twice in different physical locations,” says Rossi, who notes that the Internet Archive maintains data centers at its San Francisco headquarters building and Physical Archive buildings in Richmond, California. In addition, there are partial copies of archive data in Amsterdam and at the Library of Alexandria in Egypt. “We have tens of thousands of hard drives, so there is a constant flow of drives failing that need to be replaced quickly. We also do audits of the files to make sure we aren’t suffering from bit rot.”

 Servers at the Internet Archive's data center.

Racks of hard drives in Internet Archive’s main headquarters datacenter.

But the biggest challenge of them all? “Keeping media accessible to the public. When new browsers, tablets or phones come on the market, file formats can go out of date quickly, so they need to be updated for the latest devices,” Rossi says.

None of this would be possible, of course, without the tireless efforts of Internet Archive’s dedicated team of 150 employees and countless volunteers, all serving a noble cause, Rossi insists.

“We believe it’s crucial to provide free access to information. Our society evolves because of information, and everything we learn or invent or create is built upon the work of others,” adds Rossi. “In a digital age, when everything is expected to be online, we need to make sure the best resources are available. The human race has centuries of valuable information stored in physical libraries and personal collections, but we need to ensure that all of it is online in some form.”

While people can locate media from many different important sources on the Net—YouTube, Spotify, Flickr, etcetera—“these are not libraries dedicated to keeping knowledge safe and accessible for the future,” says Rossi. “We have seen many commercial resources die off over time, including Yahoo Video, Posterous, and MobileMe. Unfortunately, when a company goes out of business, or simply decides that it’s no longer in their interest to provide a service, media disappears.  But the knowledge contained in the Internet Archive will not disappear.”

A librarian’s paradise

Jessamyn West, a library consultant and community liaison for the Open Library project, says she’s proud to be among Internet Archive’s legion of contributing librarians and Good Samaritans.

“The Internet Archive is deeply committed to free culture and sharing as much as possible,” says West. “We wrangle big grants, maintain tons of servers, get loads of volunteers and people working on important projects, and are making the content they archive as findable and discoverable as possible by using metadata such as MARC (machine-readable cataloging) records.”

Many private and commercial entities who have quality content “want to lock it up and sell you access to it,” she adds. “We make it available for free, and that’s especially important to the underprivileged and to people in other countries who may not have free access to information. This kind of access has great value, because knowledge is power.”

 Book scanning machine

An Internet Archive Scribe book digitization machine in use. Photo by David Rinehart. Copyright Internet Archive. Used with permission.

John Wiggins, director of Library Services and Quality Improvement at Drexel University, marvels at Internet Archive’s scope and influence. He says the site’s vision to preserve and provide access to selected pages from the ephemeral, continually changing web has created a historic timeline and reference source invaluable to those seeking simple answers as well as those looking to understand the changes in culture over time.

“As an information professional and a researcher at an academic university with a strong civic mission, the high value and cost of access to authoritative resources via vendor gateways—selected and managed for the academic community by the library to support the work of faculty as teachers and researchers/knowledge creators, and students as learners—contrasts sharply with the amount and quality of information available on the Internet to the average citizen,” says Wiggins. “The Internet Archive helps address this gap with free access to local history and other pages describing recent and more distant events, and broad coverage of changing web resources worldwide.”

And that’s important, Wiggins continues, because as our population ages along with the Internet, “the senior citizen of the future or the student in middle school can expect to access the Internet Archive as a memory book of personally significant events in a way that has never before been available.”

Much like the Public Broadcasting Service and National Public Radio, which rely on community contributions, the Internet Archive is dependent on public kindness to keep its hard drive platters spinning and the electricity turned on.

Interested in contributing? Volunteer your services or make a financial donation. You can also donate digital media collections (ebooks, movies, or audio) that you own and want to share with the public: register for a free account and hit “upload.”

The post Wayback Machine: How The Internet Archive Continues To Inform, Serve, and Inspire appeared first on The Content Wrangler.

Categories: DITA

10 Lessons Learned from the SEC Plain English Handbook

The Content Wrangler - Thu, 2016-05-12 03:00

“Investors need to read and understand disclosure documents to benefit fully from the protections offered by our federal securities laws. Because many investors are neither lawyers, accountants, nor investment bankers, we need to start writing disclosure documents in a language investors can understand: plain English.”

In 1996, the U.S. Securities and Exchange Commission (SEC) launched a series of initiatives to simplify the language used in documents intended for the public. As part of a larger federal initiative toward the governmental use of plain language, the SEC’s efforts culminated in the enactment of the 2010 Plain Writing Act. This act requires all SEC documents to be written in a manner that is “clear, concise, well-organized, and follows other best practices appropriate to the subject or field or audience.”

In 1998, the SEC published its Plain English Handbook, containing guidelines to help financial institutions write prospectuses and disclosure documents in a manner that is clear and comprehensible to most investors. As virtually all investment products are fraught with risk, investors need to be well-informed as to the potential risks, rewards, and costs of a given investment product. It goes without saying that the information provided should also be comprehensive, detailed, and intelligible.

The problem, however, is that most investors find these documents to be indecipherable as they are often written in “legalese” or contain unnecessary amounts of industry jargon. The SEC’s handbook aims to remedy this situation by providing financial companies with practical tips on writing disclosure documents that are comprehensive yet readable.

Ten Lessons From the SEC Plain English Handbook

 Man reading documents

1. Reading penetrates content beyond the text

Let’s take a step back and consider what it is that we actually “read” while reading. Written content delivers the bulk of its informational payload through words and sentences. When we read, we spend most of our time deciphering information at the word and sentence level. But neither level is separable from the larger structural context that binds it or from the marginal structures that provide it support.

While we spend most of our effort deciphering words and sentences, we also decipher meanings contained within a complex set of other factors, including spatial and topical hierarchies, visual layout, typography, rhetorical nuance, tone, genre of speech (legal, casual, sales-oriented speech), and content flow. All of these factors play a significant role in structuring a reader’s end-to-end content experience. And this experience can take place within the boundaries of a single document, or across a series of related documents (e.g., marketing content, prospectus, client application, and risk disclosure).

2. Each reader approaches the text from a unique base of knowledge and experience

That each reader approaches text from a different level of comprehension may seem obvious. But considering the way most financial disclosure documents are written, such an obvious notion seems to be disregarded. When financial documents fail to engage the average investor’s comprehension level, the possibility of a misreading dramatically increases.

A reader’s personal base of knowledge serves as a context in which textual information is at once absorbed, filtered, expanded, and transformed in the course of reading. Interpretation is by nature an adaptive and transformative process. Lacking clear textual constraints, interpretation becomes increasingly transformative as it attempts to adapt to vague conditions. When it comes to conveying strictly functional information—information that is not open to multiple interpretations—the variability of the reading process is best managed through simple and clear writing.

3. Readability is determined by the efficiency of interaction between text and reader

Regardless of the size or complexity of information contained in a document, the efficiency with which a reader can decode the information is what matters most. This implies not only the ease and speed with which a reader can get through a text, but also the reader’s ability to retain and remain attentive to the information.

If a text is structured in such a way that a reader can understand the words, get through each sentence (or section) efficiently, and retain the information, then it’s conceivable that even the most complex information can be successfully conveyed. These aspects are directly connected to word choice, syntactical structure, mode of organization, and visual presentation.

 Credit balance definition written in Plain English

4. Word choice significantly affects reading efficiency

Unfamiliar words, or familiar words used in a different context, can slow down a reader’s ability to process information. When a reader comes across either case, s/he will typically search for meaning in the surrounding context. The reader will have to re-read previous sentences or sections, or continue reading only to return to the original sentence to reposition its meaning within the proper context. This process is highly inefficient. It slows down the rate of comprehension, disrupts the flow of reading, and impedes the cognitive retention and recall process.

Let’s take a look at an example from The Street:

“NEW YORK (TheStreet) — JPMorgan initiated coverage on Boeing stock with an “overweight” rating and a $175 price target.”

What does “overweight rating” mean?  Is it a good rating, or a bad one? Let’s move on to the next sentence.

“The firm said it began coverage on the aerospace company as it believes Boeing will be able to grow free cash flow by 40% by 2017. Shares of Boeing are up by 0.05% to $146.70 at the start of trading on Tuesday morning.”

Apparently, an overweight rating is positive, as Boeing’s cash flow is expected to increase. But this does not tell us what overweight means. Is the stock overweight? Does an overweight rating mean the same thing as a buy recommendation? Perhaps the confusion has to do with our more common associations with the term “overweight.”

Although “overweight” is a term commonly used in the financial industry, many investors, particularly new ones, are unfamiliar with its meaning in a financial context. Most readers associate “overweight” with health: to be overweight is to be unhealthy. Imagine reading the above sentence from this perspective. If the price of Boeing stock is currently below $175—its price target—the term “overweight” conjures up the image of a bulky stock attempting a difficult rise against the force of gravity; its own weight slowing it down or causing it to tumble.

Although this interpretation is incorrect, it is nevertheless a reasonable association. In finance, overweight refers to a situation in which an investor holds an excess of a given stock because s/he expects it to outperform other stocks in the portfolio. Overweight refers not to the stock, but to the overall “weight” of its holdings within a portfolio of stocks. Seen in this light, an “overweight” rating is a good thing.

5. Readers have deeply ingrained syntactical expectations

Most English speakers have an ingrained tendency to seek out or project subject-verb-object sentence patterns. The further a text deviates from this basic pattern, the less efficient it becomes for both reading and retention. Readers are likely to project onto it the wrong sentence structure until they are able to gradually adapt to the syntax.

Let’s take an example from the SEC handbook:

The following description of the particular terms of the Notes offered hereby (referred to in the accompanying Prospectus as the “Debt Securities”) supplements, and to the extent inconsistent therewith replaces, the description of the general terms and provisions of the Debt Securities set forth in the Prospectus, to which description reference is hereby made.

Now let’s place emphasis on the main subject-verb-object relationship:

The following DESCRIPTION of the particular terms of the Notes offered hereby (referred to in the accompanying Prospectus as the “Debt Securities”) SUPPLEMENTS, and to the extent inconsistent therewith REPLACES, the DESCRIPTION of the general terms and provisions of the Debt Securities set forth in the Prospectus, to which description reference is hereby made.


This makes it somewhat easier to read, but it still warrants revision:

This document describes the terms of these notes in greater detail than our prospectus, and may provide information that differs from our prospectus. If the information does differ from our prospectus, please rely on the information in this document.

 Document designer at work

6. Readability is also a function of design

Design serves a structural and an aesthetic function. A well-designed document is one in which these two functions enhance each other in a balanced manner. A design can be aesthetically appealing yet poorly structured, in which case the text may be obscured. It also works the other way around—a well-structured design with bad aesthetic choices can distract or repel readers. The fundamental thing to consider, as stated in the handbook, is that “design serves the goal of communicating the information as clearly as possible.”

While design entails a very broad range of elements and strategies, the handbook focuses on five primary design elements: hierarchy (distinguishing levels of information), typography, layout, graphics, and color. As we mentioned in the first lesson (reading penetrates content beyond the text), there’s much more to a written document than words and sentences. If tone of speech can alter the meaning of a text, then the same can be said for the aesthetic “tone” of design.

Furthermore, as design plays a dual function in structuring and enhancing text—it defines the spatial and sequential organization of information in addition to defining its sensible or aesthetic expressiveness—its role in enhancing readability is critical.

7. Simple does not mean “simplistic”

In the first chapter of the handbook, the authors set out to dispel a common misconception about writing in plain language. Plain language is not about making a text readable through the deletion of important information. It’s about simplifying complex information, not making information “simplistic” (making it appear simpler than it really is through deletion).

On the use and abuse of jargon:

Prospectuses and disclosure documents often contain jargon, which is a category of specialized terms used within a profession. Among practitioners in a specific field, jargon enhances efficiency: it serves as a form of shorthand terminology that summarizes a wide field of concepts. By summarizing a “field” of concepts, jargon presupposes a number of things: a network of related concepts, their histories and modes of use, their transformations under different contexts or circumstances, etc.

Take for example the term “arbitrage.” This simply means the simultaneous purchase and sale of an identical product in two different markets to profit from the difference in price. To a finance professional, the term arbitrage refers to a number of different scenarios: triangular arbitrage, merger arbitrage, political arbitrage, etc.; all of which involve completely different processes, histories, technologies, markets, and strategies.

 Man reading a document with a confused and frustrated look on his face

But there is also another side to jargon: specialized terms meant not to summarize a large volume of concepts but to replace common terms with the intent to misdirect. For instance, a “career-change opportunity,” or a “career alternative enhancement program,” is an empowering way to say, “Hey, you’re fired!” Another classic example is former Federal Reserve Chairman Alan Greenspan’s use of “Fed-speak” to avoid addressing issues directly (hoping that his lack of clarity would either be ignored or create enough confusion to prevent unintended market jolts).

A reader who is unfamiliar with financial jargon will not be able to distinguish between the two uses of jargon as described above. As most investors are not financial professionals, it’s best to stay on the safe side and avoid using jargon altogether.

8. Readability is a form of competitive advantage

In chapter 3, “Knowing Your Audience,” the Plain English handbook bullet-points a set of questions to help create an investor profile. There is one marginal entry—marginal because it is mentioned only once in the entire handbook and appears buried near the middle of the list (note my reading bias toward “placement” in which a bullet point in the middle is not as important as a point at the beginning or end). It reads: “Will they read your document and your competitors’ side by side?”

Prospective investors will compare and contrast financial products to find the one that best suits their investment goals. It’s about comparing information…or it should be. If a prospectus is written so poorly that an investor cannot understand it, it may end up being misread or avoided altogether. And the company that published the indecipherable prospectus, regardless of the quality of its financial product, ends up losing.

9. Readability encourages trust

There’s another advantage that comes with readability: it encourages trust. Here’s a wonderful quote from Deborah S. Bosley, Owner of The Plain Language Group.

“According to the 2015 Edelman Trust Barometer, only 54% of the public trust the financial services industry. One area that causes confusion and adds to this mistrust is written communication. The entire financial services industry could increase profits, decrease lost time, and improve customer relations if they wrote more financial information in plain language. Many regulations now require “clear and conspicuous” language, but perhaps a more important reason to create written information in plain language is that clear, concise, and credible communication contributes to the bottom line and to customer satisfaction.”

It’s not difficult to understand how “written communication” in the financial industry typically “causes confusion and adds to [the current environment of] mistrust.” Think about the client’s end-to-end content experience: marketing content infused with cleverly-designed rhetoric followed by indecipherable yet critical information on risks and costs.

For example, a brokerage’s marketing material touts a “state of the art electronic platform for professional traders,” reinforced by “your satisfaction is our top priority.” Both of these messages are clear: the technologies are up-to-date, implying fast and reliable execution (that’s what “state of the art” means to a trader), and the brokerage will do its best to serve clients (it is their “top priority”).

A client then receives a lengthy risk disclosure. Buried in the middle of the document is a statement that reads: “Electronic trading facilities are vulnerable to temporary disruption or failure. You may experience loss of orders or order priority. Your ability to recover certain losses as a result of such disruptions is limited by the system providers, clearing firms, exchanges, and brokerage.”

I have seen what happens when technology failures (e.g. a downed server) cause clients to lose money. If the risk disclosure were clearer, there would be no need for a financial representative to spell out what this small entry in the risk disclosure means: you lost money due to a failure in our technology; you are responsible for your own loss.

So much for trust.

 Group of business professionals collaborating

10. Successful content requires organizational collaboration

Last but not least, well-written financial documents require teamwork. The reason is that each department—marketing, investor relations, compliance, legal, design, and management—will have not only a different perspective on a product, but also a different language for expressing that perspective. In other words, a financial company will need to break down content silos in order to create a comprehensive and clearly written document.

Let’s take a brief detour. In finance-speak, the term “volatility” refers to rapid and sharp price fluctuations. A stock is considered volatile if its price movements make sudden and dramatic directional changes on a regular basis. Volatility may result from a scarcity of market participants, in which a few hold significantly larger amounts of capital, allowing them to move markets; or volatility can reflect conflicting market views or sudden shifts in market sentiment. Price movements, no matter how smooth or volatile, are always composed of heterogeneous perspectives and actions. This is analogous to the process of constructing and conveying product identity.

The process of putting together and presenting an investment product involves multiple efforts and multiple angles of approach. One department’s concern may not be shared by another. Yet every department is working together to develop a product offering that appears seamless, linear, and singular. Without a coordinated approach to “smooth out” the perspectives on a given product and the language used to describe it, the end-to-end product experience may, from a content standpoint, exhibit a sense of rapid angularity or incongruity (as in the rapid shift from enticing marketing rhetoric to indecipherable legal-speak). In a manner of speaking, this constitutes a kind of “volatility” in the realm of content and product experience.

In conclusion, the simplest means of conveying information are usually the best. But it’s also important to note that “simplicity” is both a variable and adaptive principle: it should match the level of complexity required to engage or explain a topic. With this in mind, it’s really important to know your audience. And if you truly intend to engage them, make a sincere effort to speak or write in a language that they understand.

The post 10 Lessons Learned from the SEC Plain English Handbook appeared first on The Content Wrangler.

Categories: DITA

RSuite CMS Developer License Now Available to the MarkLogic Community

Really Strategies - Tue, 2016-05-10 14:00

Audubon, Pa. – May 10, 2016 – RSI Content Solutions announced today at MarkLogic World 2016 the availability of an RSuite Developer License. RSI kicked off 2016 by releasing RSuite 5, the latest version of its highly configurable, multi-channel, automated publishing content management solution. In response to many requests following the RSuite 5 release, RSI is now officially offering a full, no-cost RSuite Developer License to the MarkLogic community.

RSI is also launching a completely revamped and enhanced online user community. The new community provides a much more user-friendly interface and delivers easier access to support, documentation, user forums, product videos, and downloads.

“The past few months have been an exciting time for us with the release of the RSuite Developer License,” stated Lisa Bos, Chief Technology Officer at RSI Content Solutions. “Now that we’re offering it alongside our new RSuite user community, it will allow us to expand our user base with ease and allow a seamless transition for our customers to use RSuite on a daily basis while increasing their production workflow and decreasing their product time to market.”

To download and register for the RSuite Developer License, as well as access documentation and other training materials, please visit the RSuite CMS Community.

About RSuite

RSuite has been built to serve as the centralized enterprise publishing solution for organizations that wish to automate their publishing processes and reduce time to market by over 50%. RSuite is optimized for the creation, management, repurposing, and multi-format, multi-channel delivery of content by utilizing an enterprise-strength native XML repository which stores and indexes XML content in its natural hierarchical format. In addition to its strong XML capabilities, RSuite manages any and all forms of digital assets (MS Word, PDF, images, audio, video, etc.) and all of their associated metadata.

RSuite’s powerful and highly-configurable workflow engine allows customers to implement multiple workflows that incorporate both manual and automated tasks, such as transformations, packaging, delivery, and more.  Customers are implementing RSuite to manage the end-to-end publishing process, from content creation through multi-channel, multi-format deliveries.  For more information, please visit http://www.rsicms.com/rsuite-enterprise-publishing-solution

About RSI Content Solutions

For over 16 years, RSI Content Solutions has been at the forefront of implementing content agility solutions for publishers, media companies, Fortune 1000 businesses, government organizations, and more.  With headquarters outside Philadelphia, PA, USA, a MarkLogic engineering center of excellence in Chennai, India, and affiliate offices around the world, RSI has helped over 250 global organizations implement appropriate content agility solutions. For more information, please visit www.rsicms.com.

Categories: DITA

The Content Industry Talent Shortage: What You Can Do About It

The Content Wrangler - Mon, 2016-05-09 03:31

Find yourself juggling extra content responsibilities at work lately? Forced to learn more new apps, dashboards, templates and markup languages than your job description ever required? Notice a few more empty desks around the office than normal?

Welcome to the world of the chronically understaffed content team, where highly skilled co-workers quickly bolt for bigger bucks elsewhere, fresh for-hire candidates graduate with inadequate aptitudes, and the technology powering publishing and marketing initiatives seems to change faster than the rotation of political attack ads on television.

Just how pervasive and impactful is this content industry talent shortage? Consider that, based on the results of the Content Marketing Institute’s (CMI) 2016 B2B Content Marketing report, 25% of North American B2B marketers reported that gaps in the knowledge and skills of their internal team were one of their top five challenges, and 21% said finding or training skilled content marketing professionals and/or content creators was one of their top five challenges.

Similar findings were revealed in CMI’s 2016 B2C Content Marketing report: 21% of B2C content marketers said gaps in the skills and knowledge of their internal team were one of their top five challenges, while 18% said finding or training skilled content marketing professionals and/or content creators was one of their top five challenges.

For its 2015 study, CMI asked their “challenges” question differently (by presenting a list of challenges and asking respondents to “select all that apply” vs. asking them to identify their top five challenges). In 2015, a whopping 32% of B2C marketers said they were challenged with finding trained content marketing professionals vs. only 10% in 2014.

 Empty workstations

Why is there a content industry talent shortage?

Michele Linn, vice president of content for CMI, says several factors have contributed to the current talent shortage.

“The content industry is an emerging field, and businesses are trying to figure out what they need to do and who can help them. It can be tough to identify the requirements,” she says. “Plus, there is a lot of overlap and confusion with roles. Additionally, there are people who work on the front-end and the back-end in both the content strategy world and the content marketing world.”

Another problem is that the content industry is still a fairly new one, according to Ann Rockley, CEO of The Rockley Group Inc.

“The concept of content strategy has been around for almost a decade, but it really only burst into public awareness in 2009 with the publication of Content Strategy for the Web,” says Rockley. “Content marketing is among the newest of the content areas and, therefore, the area with least amount of resources. Many employers have only begun to recognize the need and value of this kind of talent. And as a new industry, there are very few educational programs or other means of getting accreditation in the field. That means that the primary way of learning the job is through on-the-job experience.”

But even job experience is challenging to obtain because many companies don’t yet have the positions available “or are looking for people with longer experience than the industry has even existed,” adds Rockley. “It becomes a vicious cycle.”

A shortage in the right skill set—not necessarily talent—is the bigger quandary today, insists Sarah O’Keefe, president of Scriptorium Publishing.

“It’s not a lack of ability or creativity, it’s the fact that the people in the industry have not often acquired the required technical skills, which keep changing,” O’Keefe says. “It used to be enough to be a good writer or creative, but now content creators and marketers are being asked to do more technically.”

Jay Acunzo, vice president of platform at NextView, agrees that the problem is not so much that there’s a lack of talent, “but rather the wrong mentality for brands to successfully attract that talent—and for them to seem attractive to that talent,” he says. “That’s a missed opportunity for both job seekers and employers.”

 Team of knowledge workers in planning meeting

The most affected areas

Rockley says content marketing has suffered the worst shortage, since marketing is the most recent to adopt content strategy principles.

“However, content strategy is also experiencing large shortages. The field is now well recognized as valuable, but education has only recently begun to be available, and it is not widespread,” notes Rockley.

The area O’Keefe has observed as most in need is enterprise content strategy. “The technical skills that go with that job, particularly information architecture, can be especially lacking,” she says. “In the world I work in, we also have a lot of demand now for XML-based authoring and XML configuration skill sets—essentially programmers who know how to configure complex systems to support XML-based publishing. And writers have to be able to write in an organized, structured way within templates, too.”

In her experience, Linn also identifies a lack of specialists who can merge all of the content marketing and strategy disciplines to create the best possible user experience.

The fallout and the forecast

To underscore the effect a content talent dearth can have on your business, consider this scenario: A technology company is creating a content marketing group and is looking to fill a new position. Frustrated by a lack of qualified applicants, the company settles for a marketer who is well versed in traditional methods but who does not fully understand the merits of content marketing. And that’s when the problems start.

“It becomes apparent that the new hire doesn’t understand that the goal is to build an audience through educational content—she’s instead pumping out content that is focused on product and not interesting to your audience,” says Linn. “Moreover, she doesn’t grasp how to set up a repeatable process to get things done efficiently. She lacks the social marketing background to formulate an effective distribution plan. And perhaps she doesn’t understand analytics to know what is and isn’t working.”

Rockley believes the content industry shortage has the legs to be a chronic problem for the foreseeable future.

“Twenty-plus years ago, there was a shortage of information architects and usability specialists. Now, education is widespread and there are sufficient resources that have learned on the job to largely satisfy the need,” says Rockley. “This is a cyclical trend. There will be some other type of resource in the content industry in the future and the lack of resources trend will move to that role.”

Linn agrees that it will be continually problematic to find people who are keeping up with trends—due to the constantly changing nature of marketing. “But hopefully, as the field matures, more people who are educated on the basics of content will be better available to help companies,” she says.

 Skills required

Strategies and solutions

To improve your organization’s chances of attracting and retaining higher-skilled and talented content creators, marketers and strategists, try these tips:

  • Have a plan and know what kind of particular professionals you need. “Be specific with what gaps you are trying to fill,” Linn says.
  • Create a positive environment and workplace culture that welcomes, rewards (both monetarily and with exciting, fulfilling tasks) and supports the talent you recruit, suggests Rockley.
  • Value past work samples over resumes. “Who cares if they say they can write or create or produce good work? Show me,” says Acunzo.
  • Give creative employees the box to play in and get out of the way. “Top talent is fine playing within a box if it’s well-defined and agreed upon. It’s actually liberating and enables better ideas and execution to have boundaries,” Acunzo says. “But too often, leaders who can’t retain this type of talent typically jump into the box to meddle too often, even after it’s been agreed upon up front. Put it to writing, create a paper trail of the project, and then let creators create.”
  • Consider growing someone from within. “Choose a person who has deep knowledge of your business and eagerness to learn. Send them to conferences, pay for courses and reward them with success,” Rockley says.
  • Try hiring an intern who can be educated in the area where you’re lacking talent. “But pay that intern a fair living wage,” recommends Rockley.
  • Supplement your whole team. “Hire enough people so those you have working for you don’t need to be jacks of all trades,” adds Linn.

The post The Content Industry Talent Shortage: What You Can Do About It appeared first on The Content Wrangler.

Categories: DITA

A Swing and a Miss: Faulty Customer Support Metrics

The Content Wrangler - Thu, 2016-05-05 15:42

Editor’s Note:  The Content Wrangler is presenting a weekly series of twelve articles that provide useful insights and practical guidance for those who produce customer support websites. Columnist Robert Norris shares how to overcome operational challenges related to harvesting, publishing and maintaining online help knowledge bases. His ninth installment examines how content wranglers can improve the quality of life for our colleagues in support roles and leverage insights (including customer support metrics) to our mutual benefit.

Customer Support Metrics

Organizational leaders love their dashboard metrics, and when it comes to their support teams, those metrics come by the boatload: response & resolution, escalation, abandonment, average talk time, etc. In most organizations, customer support metrics focus the team’s attention on solving each user’s problem as quickly and efficiently as possible.

From the perspective of a content strategy that emphasizes quality control, this emphasis on getting the customer off the call ASAP is hardly strategic. For example, time-to-resolution metrics reward a support team for speedily solving the needs of multiple customers—each of whom had the same problem—while leaving the problem’s cause untreated for hundreds more. Moreover, speed-based criteria promote risky corner cutting and undesirable behavior, e.g., support representatives saving files locally for quicker access, distracted listening, jumping to conclusions, and interrupting. In cost-benefit terms, we should be aware that every time we fail to use the experience of a single user to identify and correct a deficiency in self-help support, we have missed the opportunity to solve the problem for every user.

Support center personnel are typically characterized as being the tip of the spear because they represent the organization to the client via direct contact. It’s a fair characterization, but does not fully leverage their unique skills and opportunities. An astute content wrangler recognizes that support center personnel are perfectly placed to assess the content experience of a single user to the potential benefit of all users.

Consider:

  • A user engaging the support team is a potential source for discovering a deficient self-help resource
  • For each user surfacing a topical problem, there are likely to be many users with neither the time nor the energy to do so
  • A support team member who is alert for content deficiencies is contributing to our organization’s continuous improvement; one who is racing the clock is doing the opposite

 Why?

 

What If We Focus On Why?

Let’s consider what would happen if success were defined by having our support representative engage select users to determine why they were not able to locate what they sought. Circumstances have brought these two together because the user was not able to solve the problem on his or her own. In collaboration, the customer support representative and the customer have the potential to discover lurking problems, including:

  • Confusing instructions
  • Gaps in the knowledge base
  • Non-intuitive navigation
  • A layman’s synonym for company terminology
  • Out-of-date content
  • Duplicate resources
  • Contradictory content
  • Malfunctions

With revised customer support metrics that reflect the quality-focused principles of our content strategy, management will soon recognize that there is a much higher return on investment when a customer support representative takes 10 minutes to identify and document a deficiency on the website than when ten reps take only one minute each to solve the same problem over and over.

Of course, judgment is required. When the queue is full or the user is agitated, let’s not dally—solve the problem and move on. But when the universe is smiling upon us, an articulate, patient user is asking for help, and call volume permits, let’s empower customer support personnel to discover and document the deficiency and thereby solve it for all users. Then reward the support representative and their supervisor. By next week, we’ll have the whole team filling the QA pipeline.

 Support center

As content wranglers, we are in a unique position to help the organization embrace the expertise of our colleagues on the support team and dramatically improve the nature and impact of their work.  We can initiate the transformation by asking highly respected customer support representatives to share their frank perspectives on the usefulness of the knowledge base(s).  After gratefully receiving their critique we need only act upon what we learn to demonstrate our sincerity and capability to make their jobs easier.  From a technical standpoint, it is likely that we can leverage their technology, e.g. Salesforce, to help our colleagues quickly capture the information we need and alert us to the problem.  Moreover, we benefit by gaining useful insights into the day-to-day needs of our audiences, which will help us set priorities and tweak our resources to better serve them.

Recap:  Every confused user dealing with a support rep represents an opportunity to evaluate the efficacy of our self-help resources, but when management’s zeal for efficiency inhibits the attention to detail needed to continuously improve our effectiveness, it’s a swing and a miss.

Last Week:  Robert’s eighth of twelve articles, Best Practices for Fostering Support from Stakeholders, shares insights into how we can gain the support of stakeholders in middle and upper management.

Next Week:  Robert’s tenth of twelve articles, “It’s About Quality,” offers insights and practical guidance on how to invoke a quality control program that boosts productivity and streamlines operations by clarifying roles, responsibilities and workflow.

 Five stars. The highest rating.

The post A Swing and a Miss: Faulty Customer Support Metrics appeared first on The Content Wrangler.

Categories: DITA

Gilbane Advisor 4-28-16 — Bots vs apps, content management news, media, more

Bots vs apps, conversational interfaces, AI — lots of hype, and lots of money. If all the coverage of these topics and how they relate has left you scratching your head, our first three articles below will get you grounded in reality. The first, a longish but enjoyable post by Dan Grover, is my favorite. He uses the sneaky trick of measuring how many screen […]

This post originally published on https://gilbane.com

Categories: DITA

Workshop: Use Brain Science to Control What People Remember

The Content Wrangler - Thu, 2016-04-28 21:11

Audiences will forget most of the content you present and the little they remember is random. Using neuroscience-based guidelines (brain science) helps you to master with precision what audiences take away from your content and what they are willing to do with it.

During this full-day workshop, Dr. Carmen Simon of Rexi Media will dive deep into brain science techniques and teach you how to direct an audience’s attention to what counts, ensure your most important content is memorable, and make it easy for audiences to reach a decision in your favor.

You will work on your own content, and leave with practical skills you can apply immediately to any type of communication material.

Specifically, you will learn how to:

  • Identify three essential criteria for a memorable message
  • Master brain science techniques to capture and sustain attention, and avoid the most common errors that lead to forgettable content
  • Implement a key technique that is missing in most business presentations to change an audience’s behavior
  • Apply a 5-step, sure-fire persuasive script to spark action

Register: Use Brain Science to Control What People Remember
Where: Mark Hopkins Hotel, San Francisco
When: May 20th, 8:00 am to 5:00 pm

About the instructor, Dr. Carmen Simon
Dr. Carmen Simon is an experienced cognitive scientist, published author, and a frequent keynote speaker at conferences in the U.S., Canada, Europe and Asia. She holds doctorates in instructional technology and cognitive psychology, and uses her knowledge to offer presenters a flashlight and a magnet: one to call attention to what’s important in a message, the other to make it stick in the audience’s brain so they can act on it. Carmen’s brain-science-based presentation coaching helps business professionals motivate listeners and stand out from too much sameness in the industry.


The post Workshop: Use Brain Science to Control What People Remember appeared first on The Content Wrangler.

Categories: DITA

Why is DITA Important?

The Content Wrangler - Tue, 2016-04-26 20:41

Why is DITA so important?

Information is expanding across the enterprise. Businesses are creating and filling corporate digital landfills, clogging operational processes without knowing their complete content holdings or the value contained within.  Imagine a world where you assemble a document rather than write it out word for word each time. Imagine that content relevant to what you are creating has already been created and is now made available for you to reuse. With the Darwin Information Typing Architecture (DITA), it’s possible.

Check out this video from the folks at Precision Content. We need more content like this: content that makes DITA less opaque and its value easier for upper management to understand.

The post Why is DITA Important? appeared first on The Content Wrangler.

Categories: DITA