Adobe DITAWORLD 2020 – Day 2 Summary by Kyra Lee and Grace Smith
by Stefan Gentz
Posted on 10-08-2020
Just like yesterday, the second day of DITAWORLD 2020 kicked off with a warm welcome from our co-moderators: Stefan Gentz and Matt Sullivan. Many of the same members of the audience present for day one were also available to join us in the chat for day two (score!), along with some new ones (amazing!). Once the general overview of the day concluded, we were able to jump right into the content with our first engaging presenter of the day. Let’s get right into it!
In this post:
- [Welcome Note] Stevens (CIDM/Comtech): Welcome to Adobe DITAWORLD 2020, Day 2
- [Keynote] Stark (Adobe): Transforming Content for the Chaotic World
- Herian (3M): Working at the Speed of Covid
- Aschwanden (Publishing Smarter), Porter (Gulfstream): Automating DITA Publishing
- Wortimla RS (Adobe): Cloud or desktop? Or both?
- Fleischer, Sarette (Palo Alto Networks): Our Journey from DevOps to PubOps!
- Braster (Etteplan): The Value of User Centric and Quality Content
- Machert (Congree), Swisher (Content Rules): Consistent Structure, Consistent Style
- O’Keefe (Scriptorium): Beyond the OT
## Welcome Note

### Welcome to Adobe DITAWORLD 2020, Day 2

Dawn Stevens, Director and President at Comtech Services and CIDM
Dawn Stevens, Director and President at Comtech Services and CIDM, started the day off by setting the stage for Day 2 of DITAWORLD. According to Dawn, DITAWORLD is like a musical. It is also a collection of stories, albeit without the music and dance like Broadway musicals (unfortunately). Even though she is just joking about putting together a musical about DITA, aren’t we all storytellers? Aren’t we all, as technical writers, trying to tell a story about DITA?
There are many roles DITA can play in your story:
- Quirky sidekick
- Obstacle to overcome
- Secret weapon
- The happy ending
No matter what you are planning, DITA has a role in how you tell your story and create impact.
So, as a technical writer, how does DITA fit into your narrative?
## [KEYNOTE] Transforming Content for the Chaotic World

### Why structure and insights are ever more important in the digital-first customer journey

Loni Stark, Senior Director, Strategy & Product Marketing at Adobe
We welcomed Loni Stark, Senior Director, Strategy & Product Marketing at Adobe, back this year at DITAWORLD as she presented about why structure is the bedrock of creativity in this chaotic world.
The pandemic forced Loni to transform her dining room into a work/painting/living space, where she deals with her creative chaos. She shared a picture of this workspace with us, and though it might not look like it, there is organization within this chaos. If you take a closer look at the cabinet, you can see labels (or “metadata,” as she put it) that tell her what is in each drawer. After hearing how she organized her things inside the drawers, I am sure the sentiment resonated with everyone when an attendee said: “oh, those drawers are the dream of any Info Architect and Taxonomist”.
So, what do Loni’s drawers have to do with transforming content for the chaotic world? The answer is “everything”!
2020 events have accelerated digital transformation, and COVID-19-driven digital adoption rates have covered decades in days:
- E-commerce grew 10 years in 8 weeks
- Telemedicine grew 10 times in 15 days
- Remote working grew to 20 times more participants in 3 months
- Remote learning grew by 250 million users in 2 weeks
- Online entertainment grew 7 years in 5 months (I must admit that Netflix has become my best friend in this pandemic …)
Adobe is presented with a unique challenge to help organizations become more agile and nimble in how they communicate. 2020 has been a challenge but also a growth year for many as businesses move their content online. Having digital channels is now more important than ever. The key to retaining customers is to understand them better and more rapidly. To do so, your business must:
Be proactive in communication and outreach
- Make sure your customers know about your policies and new programs available to help them during this challenging time.
- Make sure your B2B interactions can be done digitally.
Become digital and accommodate self-service
- Make sure your customers can achieve their goals with your business and evaluate things digitally.
Shift brick and mortar strategy
- Make sure your customers can reach you and your products digitally.
Have community support
- Make sure your customers feel like they are a part of your business and that you can help each other.
Have new ways of working
Having intelligent data management allows teams to respond instantly and appropriately to what matters most across the company’s infrastructure. Being able to get real-time analytics and information helps you determine which areas need improvement or transformation NOW. Analytics allow you to understand how customers engage with your content and help you personalize the customer experience.
So, how does structured content improve customer experience?
At Adobe, Loni said, they have seen how unstructured information creates chaos. Content that is locked inside a tool or format stops technical communicators from reaching their goals. Lacking structure in content can slow time to market, increase the risk of inconsistency and inefficiency, and make personalization ineffective. Legacy infrastructures that cannot fulfill current demands can also cause performance and scalability issues.
Like the way Loni organizes her art supplies, structured content management requires a standard for efficient reuse and streamlined workflows. This is where AEM comes into play. AEM is a single platform that allows you to easily migrate and create content, manage it, collaborate with others, and deliver personalized content. AEM is helping us reimagine the workflow of a technical communicator.
To reiterate, businesses must:
- become digital-first and work to understand their customers in real time through analytical data
- have content in a structured form to respond to customers’ needs and be more strategic in their approach.
## Working at the Speed of Covid

### How 3M Content Managers used Structured Authoring to Keep Content Up to Date in a Fast Changing Covid World

Teeghan Herian, Technical Content Management Supervisor at 3M, USA
Teeghan Herian, Technical Content Management Supervisor at 3M, USA brought us a perceptive customer story on how 3M content managers used structured authoring to keep content up to date during these challenging times. Teeghan went over the following:
Once COVID hit, there was a near immediate need to create 70+ documents with limited resources. These included technical bulletins, FAQs, tip sheets, how-to posters, and reference guides. The documents needed to be specific to COVID-19 and tailored to three different audiences:
- The general public
- Non-healthcare workplaces
- Healthcare workplaces
3M originally relied on their subject matter experts to relay the information. However, this cross-functional collaboration took a lot of time. Although it met the customers’ needs, it did not help the technical communicators’ process. Changes were happening every day as new COVID-19 information was uncovered. 3M required consistency for their customers, and their current processes were challenging to work with.
The plan: Start small
3M wanted to select documents that were similar, then compare those documents and trainings to determine whether any sections or statements applied to:
- ALL documents
- SOME documents or
- ONE document
3M started with one group of documents (such as their FAQs) that had similar content such as images, graphs, section titles, statements, and warnings to create the scope of documents for reuse. This ensured a consistent message no matter the audience level chosen on their website.
3M was already migrating completely to structured FrameMaker and found it fairly easy to make the switch as they tackled the wave of demand for COVID-19-related documents. A great effort by their team (starting sometimes as late as midnight!) allowed them to fully migrate from unstructured to structured FM. Moving forward, they had automation in place for this process to provide a consistent look and feel for their documents.
Teeghan shared a visual example of this process which included:
- Creating the topics
- Inserting the conrefs
- Assembling DITA maps with consistency
Reusing the links and the questions made a lot of sense: wherever a question existed (internal or external), there was huge value. As soon as they made one update, it was reflected anywhere that question occurred. The use of conrefs gave 3M the flexibility they needed. Having the FAQs be topic-specific without being overly specific allowed them to reuse these in any general bulletin or FAQ moving forward. Example used: Can FFRs be shared? No, they should not! Not during COVID-19, not before, not ever! Once they hit publish, 3M was able to produce consistently structured, on-brand content for their intended audiences that fit all their needs.
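As a sketch of what conref-based reuse looks like in practice (the file names, IDs, and wording below are hypothetical illustrations, not 3M’s actual content), a shared answer lives once in a reuse topic and is pulled in by reference wherever the question appears:

```xml
<!-- reuse-library.dita: single source for the shared answer -->
<topic id="faq_reuse">
  <title>Shared FAQ answers</title>
  <body>
    <p id="ffr_sharing">No. Filtering facepiece respirators (FFRs)
      should never be shared.</p>
  </body>
</topic>

<!-- any FAQ or bulletin topic pulls the answer in by conref;
     updating the library paragraph updates every document that uses it -->
<p conref="reuse-library.dita#faq_reuse/ffr_sharing"/>
```

This is exactly why one update propagated everywhere a question occurred: the publishing pipeline resolves each `conref` back to the single source paragraph at build time.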
How 3M measured success
Both internal and external customers were very happy with the consistent look and feel of the content and made that apparent in their feedback! Also, the automation possible through this process greatly improved the time it took to get from initial start to publish.
- Looking at reuse across large amounts of product- or topic-specific documents is difficult to do.
- Smaller groups work better (for them)
- Small changes within those documents to align content help drive process efficiencies in the future
- Changing from unstructured to structured FrameMaker was easy.
- Structured content also saves money when translating
- Reuse is critical for a fast-changing environment. Especially during a crisis like COVID-19 (Does it feel like the only thing that moves faster than COVID-19 these days is the speed of light or is it just me?).
A couple of notable questions from the Q&A pod that Matt Sullivan relayed at the end of Teeghan’s compact and highly informative presentation:
Don asked: Are you using a component CMS or AEM? Answer: We are currently migrating to AEM!
Susan asked: How long did the transformation take and how many people were involved? The answer: Thanks to the automation possible for the process, it took from December 2019 to July 2020 to really get their footing. It took the three of them plus outside help when necessary.
Matt Sullivan moved onto a fun little trivia session as Teeghan ended a bit early, asking “Does anyone know what 3M stands for?” The answer: Minnesota Mining and Manufacturing. Quite a few people in the chat knew this, did you? (Now I do!)
## Automating DITA Publishing

### Automating aerospace: How Gulfstream publishes style-rich, interactive PDFs from DITA

Patrick Porter, Production Test Pilot and Manager of the Flight Crew Publications Group at Gulfstream Aerospace, USA
Bernard Aschwanden, CEO at Publishing Smarter, Canada
Bernard Aschwanden, CEO of Publishing Smarter, and Patrick Porter, Production Test Pilot and Manager of the Flight Crew Publications Group at Gulfstream, gave a presentation on how Gulfstream automated the publication of their aerospace materials by migrating to DITA. After helping Bernard and Patrick with their content migration to AEM for months, it is great to just sit back and hear their perspective on the project. Well… not really, because I must write this post. Anyway, let us jump in and see what they have to say about migrating aerospace content to DITA, and how DITA can change the way we fly!
When Bernard was introducing himself, his audio started breaking down and a lively discussion broke out in the chat about the strength of Canadian internets. Canadian WIFI, am I right? With Bernard away to fix his audio, Patrick gave us an overview of the kind of content they are moving to AEM.
One of the long-term goals of this project is to enforce consistency across all their content, because they have documents spanning multiple decades, going back to 1958. The objective is to migrate all content (much of it authored in Microsoft Word) to structured DITA. Patrick said he “jumped into the deep end of the pool” without knowing anything about XML or DITA, all because Gulfstream let him fly if he agreed to take over the project!
So why did Patrick choose DITA? The short answer is that their consultant recommended that they use AEM, and you have to use DITA with AEM. After working with DITA for a while, however, Patrick found it to have a relatively simple structure that enables faster authoring and is less prone to XML structural errors.
The three main objectives of the project are to:
- Automate publishing and release for content
- Create a digital workflow (going paperless)
- Comply with brand guidelines for consistency
CAS messages (warning, caution, advisory) are one example of what they have digitized. Digitizing them helps pilots respond to situations and issues faster. With the new interactive buttons at the top of the page, pilots can now access information in a matter of seconds instead of having to flip through physical pages.
Before their conversion to DITA, Gulfstream used different tools to design and generate their Tab Index Page (the cover page of their flight manuals). This made localization and content reuse nearly impossible! Bernard and his team at Publishing Smarter suggested using XML code to generate the buttons for the tab index page, which automated the design of the document and eliminated a lot of manual effort.
Patrick and Bernard discussed some of the challenges they overcame during the project. The session slowly shifted from a presentation to a discussion with the audience on how XML code automated content design, structure, and publishing.
They also created a mechanism in FrameMaker to convert DITA maps to different versions of PlaneBook. At the click of a button, each map is published as a collection of content (over 2000 pages!) that has all the interactive functionalities.
In the long term, Patrick wants all their technical writers to be able to work in FrameMaker and AEM and have a completely digital workflow from end to end.
If you must learn only one thing from this presentation, it’s this: content design and function can create a complete experience from the author to the final content consumer.
A few of the notable questions from the Q&A pod:
Q: Does AEM allow you to author the content, or did you use any other editor/tool?
A: AEM does have an editor tool, but you can ALSO connect to other tools. In our session (next) Patrick from Gulfstream is talking a bit about their author workflow and the longer-term goals for tech people and contributors.
Q: Why did Gulfstream choose DITA over S1000D?
A: S1000D is too complex for what they had to do at Gulfstream. DITA has a much easier and simpler structure to work with, which also helps eliminate risks of errors.
## Cloud or desktop? Or both?

### Leverage the full potential of DITA Authoring & Publishing with the power of two hearts

Wortimla RS, Solutions Consultant at Adobe, USA
Wortimla RS, Solutions Consultant at Adobe, showed us in her presentation and demo how to leverage the full potential of DITA authoring and publishing with the power of two hearts, through a story: “Once upon a time, there were two products: XML documentation for AEM, and FrameMaker”.
XML documentation for Adobe Experience Manager (which we may or may not know by now) is a component content management system (CCMS) for technical documentation. XML documentation for AEM manages content from creation to delivery and provides a consistent customer experience across all touchpoints:
- Web based content creation
- Structured content management
- Online review and collaboration
- Immersive omnichannel content experiences.
Product #1: XML Documentation for AEM
A simple web editor: easy to use, yet powerful. Wortimla warned: do not let the simplicity fool you. It is friendly for all types of users regardless of experience, but it is powerful!
Extensive metadata and taxonomy handling.
Advanced version management: extensive options with branching, reverting, check-in/out, and more.
A collaborative real-time online review: easily send out a single topic or map to stakeholders of your choice. You:
- Decide who reviews which topic
- Decide which baselines to start out with
- Can see how the review has evolved over time
The review functionality eliminates the need to chase after individuals and brings in accountability.
Seamless translation: localizing and translating content does not have to be a painful process! Translation workflows are easy to create and easy to keep track of.
Omnichannel publishing: think of content as a service. By using all of the publishing options available through XML documentation for AEM, we can cater to all of the different audiences we need by taking advantage of the output presets available right in the solution.
Product #2: FrameMaker
FrameMaker is market-leading software for authoring and publishing content, and it has remained a pioneer even with its age. FrameMaker provides technical communicators with a powerful and intuitive desktop authoring tool that supports authoring in multiple standards, including DITA XML, DocBook, S1000D, custom XML, and template-based FrameMaker. Through FrameMaker, we have the functionality to:
- Collaborate with subject matter experts using online review capabilities.
- Take our content to global audiences with support for XLIFF. If you want to work with translation vendors, for example, the XLIFF standard is what everybody works with (Let’s get with the program!)
- Publish from the single source as best-in-class WYSIWYG PDFs, responsive HTML5, and more for any screen!
Wortimla notes that this is not a comparison presentation, “If we have these two tools, how can we pull the best use out of both to create the best experience?”
Right from within FrameMaker you can create any topic type that you require (topic, task, concept, reference, you name it!). Once you are inside FrameMaker working with your content, it does not matter how complex your DITA map is. You can work with anything you require due to FrameMaker’s robust nature (Use shortcuts! Work with the structure view! Do all the things!)
Wortimla suggested: let us say that FrameMaker and XML documentation for AEM got married (When? No invite?). How do these two programs really come together? It is simple. Through FrameMaker, we can get the best of both worlds.
- Use the out of the box connector to swiftly connect to AEM.
- Easily browse the AEM repository directly within the FrameMaker repository manager. All the capabilities from the AEM side come in and flow through your FrameMaker UI.
- Use the context menu to upload, check in, check out, etc., the files you need. All the content will flow through the DAM back onto the AEM side automatically.
- Search the Adobe Experience Manager repository. Right from within FrameMaker you have advanced search options that allow you to find exactly what you need with full text search, specify dates, file types, tags, etc. (No detective work required, yet all the options!)
- Check in your files when you are ready, add a check in comment, and you are good to go.
- Open your DITA content in FrameMaker from the Adobe Experience Manager option available.
For publishing, specifically to PDF, you may currently depend on FrameMaker Publishing Server, which allows automated publishing using the STS files you have, or on the DITA Open Toolkit (which works well once already developed, but is not that friendly). From within FrameMaker, if you want to create and push out content through PDF or responsive HTML5, for example, you can publish with powerful and flexible options right at the source. Plus, FM provides you with a true what-you-see-is-what-you-get experience (no surprises). For HTML5, output is highly customizable, and responsiveness comes right out of the box.
Wortimla provided us with a detailed demo that showed:
- How to connect to the repository
- Access and use the repository manager
- DITA map and topic navigation/functionality through FrameMaker
- Usability and search functionality through the XML view, for those comfortable with it
- Open, view, and edit directly in XML for AEM
- Output generation presets, with the option to change the layout, structure, and more by configuring the STS settings. If you want to work with DITA templates, for example, this is all available through FrameMaker’s DITA templates tab under the output-specific publish settings.
Wortimla’s demo neatly tied together all the information she presented, connecting XML documentation for AEM with FrameMaker, and showcased how these programs complement each other. The ease of use and the content publishing options nearly speak for themselves.
Here are the five quickfire points Wortimla left us with on the benefits of using FM with XML for AEM:
- Familiar user interface for authors
- Multi-format/multi-standard authoring for gradual migration
- Tight two-way integration with Adobe’s enterprise CCMS
- WYSIWYG PDF and easy-to-customize, responsive HTML5 outputs
- BONUS: Offline editing (which may be a requirement for some!)
Sounds good? Sounds great, actually!
A few of the notable questions from the Q&A pod:
Jignesh Shah asked: Is AEM a part of Adobe Tech Comm Suite? The answer: No, Adobe Experience Manager is a separate product (FM and AEM are separate licenses). Stefan also pointed attendees to further resources on XML documentation for Adobe Experience Manager.
Another great question from Jignesh Shah: When you publish Help to any format like HTML5, PDF, etc., does it create a .zip file or does it publish to a desired location on a computer? Matt Sullivan answered: The Publish pod gives you freedom over where and how your output is delivered. No ZIP unless a format happens to be a “ZIP” container by design.
One more great question from Jignesh Shah: In AEM, you said it allows multiple authors to work simultaneously. Does that mean that multiple authors can work on the same topic at the exact same time, with multiple cursors like in Google Docs? Stefan’s answer: Not for authoring yet, but for Review! Multiple reviewers can comment on a topic simultaneously and even chat with each other in the comments.
## Our Journey from DevOps to PubOps!

### Implementing PubOps (POPs) to test and automate your DITA workflows

Charissa Fleischer, Senior Manager at Palo Alto Networks, USA
Steve Sarette, Principal Technical Writer at Palo Alto Networks, USA
Charissa Fleischer, Senior Manager of Technical Writing at Palo Alto Networks, and her colleague Steve Sarette, Principal Technical Writer, shared their journey of implementing PubOps (POPs) to test and automate their DITA workflows.
POPs, short for PubOps, which is in turn short for “Publications Operations”, is the adaptation of development methodologies for their technical publications organization. Their team consists of members with diverse backgrounds who all help with API documentation.
The challenges they faced were the many documentation errors, such as misapplied formats, misspellings, and element misuse in XML, that were often not caught until publishing. This increased publishing costs and slowed marketing. To combat that, Charissa, Steve, and their team at Palo Alto decided to apply DevOps methodologies to how they developed documentation. Doing so would increase deployment frequency, reduce issues, streamline workflows, and reduce costs. Ultimately, their goals were to:
- Create a test harness to reduce errors and maintain high-quality documentation
- Automate repetitive authoring processes
- Automate content creation when it can be derived from another source
- Allow writers to experience writing developer docs
These goals would basically eliminate issues and manual effort, freeing writers up to build new skills and improve the content.
Steve argued that a test harness is important for a team to keep their documentation in good shape. At first, when they were working with the DITA-OT, it would spit out complicated error messages that were hard for writers to understand. So they developed a method to make things easier, and realized along the way that they had a lot of security restrictions to overcome. They eventually created a Docker environment in which to test the scripts. They also developed a scoring system to measure the quality of their documentation: essentially, when they see a quality issue, like a topic missing a <shortdesc>, they take points off its “quality” score.
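As a rough sketch of how such a scoring check might look (this is my illustration under assumed rules, not Palo Alto Networks’ actual harness; the point values are invented), a script can parse each topic and deduct points for each quality issue it finds, such as a missing `<shortdesc>`:

```python
import xml.etree.ElementTree as ET

def score_topic(dita_source: str) -> int:
    """Score a DITA topic: start at 100 and deduct points per quality issue."""
    score = 100
    root = ET.fromstring(dita_source)
    # Deduct if the topic has no <shortdesc> element anywhere.
    if root.find(".//shortdesc") is None:
        score -= 10
    # Deduct more if the <title> is missing or empty.
    title = root.find("title")
    if title is None or not (title.text or "").strip():
        score -= 20
    return score

topic = "<topic id='t1'><title>Install</title><body/></topic>"
print(score_topic(topic))  # missing <shortdesc>: prints 90
```

A check like this runs well inside a containerized environment, which matches the Docker-based testing setup Steve described.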
Steve stressed that this is still an experimental process, and they are still working towards meeting the security requirements at Palo Alto. These are the steps they took to create the test harness:
1. Everyone had to learn Python!
2. Designed a scalable and portable test framework
3. Talked to IT to identify corporate security requirements
4. Broke everything up into modular components with clear inputs and outputs
5. Integrated with source control
6. Defined a scoring system
7. Determined quality checks
8. Began developing checks
According to Steve, having a test harness gave them the ability to automate content creation, which reduced a lot of manual labor. Charissa and Steve then showed us a few examples of how they authored the content and how the editing is automated in the process.
- Make sure to ID security requirements with IT before diving in
- Encourage API writers and other scripting enthusiasts to strengthen skills
- Charissa recommended starting a study group to help everyone learn together!
- Be patient and set timeline expectations (remember that everyone has their own responsibilities)
- Document everything (for future reference)
- Define your testing environments and communicate
- Share your accomplishments to keep others interested in your work
To conclude, Steve emphasized once again that even though what they did was specific to their team, every company that uses DITA should have a test harness. According to him, once you go down this path, the opportunities for automation are endless! Scripts are relatively inexpensive to create and get even cheaper as you become more experienced. Scripts help eliminate costs and manual labor and eventually speed up your production processes.
If you have questions, feel free to connect with Charissa or Steve to chat more!
A few of the notable questions from the Q&A pod
Q: Have there been any issues with team members (especially writers) feeling overly scrutinized by the scoring system? Have you looked at using NLTK (the Natural Language Toolkit) to inspect content as part of the scoring?
A: No one has been “overly scrutinized” because they made sure managers do not use the scores as a weapon against technical writers. They tried their best to create a positive working environment for everyone.
Q: Do you need to localize content?
A: Yes, but not all content. Localization is only conducted for a few products. They also built some scripts to streamline the localization process for certain teams.
## The Value of User Centric and Quality Content

### The importance of developing quality and user centric content as part of your business strategy

Berry Braster, Technology Director, Solutions & Technologies at Etteplan, Netherlands
Berry Braster, Technology Director, Solutions & Technologies, and Bekker, Senior Solution Architect (both at Etteplan, Netherlands), provided us with another great customer story. Their topic at hand: the value of user centric and quality content. They started off by framing the importance of quality and user centric content through these macro trends:
- Digitization (technology leap)
- Urbanization and demography
- Energy transformation, sustainability
A user centric content optimization strategy is one that puts your target customer or reader in focus. The digitalization trend example showcased how the industry grows and requires content to grow with it: Industry 4.0 requires Information 4.0.
Content 1.0: Information as part of the product.
Content 2.0: Multiple information products from a single source.
Content 3.0: Integrated information from multiple sources as one content management base.
Content 4.0: Information as objects that can be used in any format and context.
A proactive technical information approach is needed for the future and to solve existing challenges and issues. The trend today is to create more and more content on online platforms. The main problem is that a lot of the content is still created inside individual business units. To meet this demand, departments must be aware that content can be reused and shared once it is future-proof. This is framed as an AS-IS and a TO-BE system:
- AS-IS: push information from legacy systems; not aligned or integrated.
- TO-BE: event-, user-, and/or context-driven pull for task-specific information.
DITA is a source for user centric content. Through DITA, the technical writer learns to create not just a document, but a rich topic “script”. This script can then be drawn on, taking just the pieces needed to publish the content audiences need. Etteplan’s on-screen visual depicts this well: one DITA document surrounded by chatbots, documents, presentations, videos, web content, manuals, and more.
The technical information maturity steps were shown with BOON EDAM as an example. They were able to work from step zero, an uncontrolled documentation format (tech info as part of other jobs, a more random approach), to step five, a dynamic communication process that fits their business needs (automated triggers and processes on content in context, tech info as information nuggets). This was a gradual process, but by keeping the goal of getting from step zero to step five constantly in mind, they were able to achieve this level of maturity. Authors need to move from creating 100% of the content to more than 100% of the content to create this user centric content.
On the maturity scale outlined before, DITA-to-PDF publishing output sits at level 2: multiple information products from the same source, where content can be immediately published to a mobile device or a web browser. At maturity level 5, there are automated triggers and processes on content in context: information is generated dynamically based on a trigger. Because the writer creates more than 100% of the content, the right information can be published at the right time on the right platform.
Berry then provided a demonstration to show what was presented in practice. It demonstrated how they can create content that is easy to work with and understand, as well as the ability to cut translation costs (even at just level one or two of their maturity list!). Berry also worked through the FrameMaker plugin HyperSTE (their language checker tool) and its functionality for consistent content development. HyperSTE can check pages, paragraphs, page ranges, etc. It then color-codes the information, allowing users to reuse approved sentences and further improve the document.
So, how do you implement content quality?
- Standardize your terminology and define your style guide.
- Train your content developers (writers, but also SMEs)
- Analyze your content to define a roadmap and milestones
- Use HyperSTE to assist with content development, embedded in your content workflow and integrated with your authoring and CMS systems.
The bottom line: it is your choice whether to create user centric content or not. However, at the end of the day, your customer base will read the content you produce and form an opinion of your product based on it (Got it? Great!)
## Transforming Content

### Breathing New Life into Legacy Content

Torsten Machert, Senior Consultant at Congree, Germany
Val Swisher, CEO at Content Rules, USA
Val Swisher, CEO at Content Rules, is back today to share more of her wisdom! This time with Torsten Machert, Senior Consultant at Congree, to talk about the importance of maintaining consistent structure and style in a controlled environment.
If you have ever heard Val present before, I am sure you are already familiar with the importance and benefits of structured authoring. It makes it easier for readers to find information and for authors to manage and reuse the content. To be successful when mixing and matching content, the topics must be consistent in terminology, grammar, style, and tone.
With standardized terminology, you can personalize customer experiences and save time and resources in authoring. It also frees up time for marketing, and at the same time, promotes a consistently positive brand image.
- You can develop a terminology spreadsheet with preferred terms, prohibited terms, acceptable terms, definitions, and usage notes. According to Val, if you go with this approach, the most important column in this spreadsheet is the terms that people should NOT use.
- Another method, which is what Val recommended that we use, is to create a Termbase. A Termbase is a database that contains terms and related information like metadata and part of speech. It is more comprehensive and consistent than a spreadsheet.
There are also two methods to use terminology:
Pull (no tools)
- With this method, writers would basically have to look up every single term themselves for future reference. To this, Val argued that no one has time to do this. No one.
Push (using tools)
- Check content using software, usually with one click. This method is easier and drives consistency.
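The "push" approach can be sketched in a few lines: software checks the text against a termbase in one pass and flags prohibited terms. This is a minimal illustration only, not how Congree or HyperSTE actually work, and the termbase entries below are made-up examples.

```python
# Minimal sketch of a "push" terminology check: scan text against a
# termbase and flag prohibited terms. The entries here are invented
# examples, not a real termbase.
import re

TERMBASE = {
    # prohibited term -> preferred term
    "log-in": "log in",
    "e-mail": "email",
    "utilize": "use",
}

def check_terminology(text: str):
    """Return (prohibited, preferred) pairs found in the text."""
    findings = []
    for bad, good in TERMBASE.items():
        if re.search(r"\b" + re.escape(bad) + r"\b", text, re.IGNORECASE):
            findings.append((bad, good))
    return findings

for bad, good in check_terminology("Utilize your e-mail address to log-in."):
    print(f"Replace '{bad}' with '{good}'")
```

A real termbase would also carry metadata and part-of-speech information, as Val notes, but the one-click "check and flag" flow is the core of the push model.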
Style is an intentional decision you make. Unlike grammar, it cannot be correct or incorrect; the important thing is that it stays consistent. Consistent style unifies content and allows for easier reuse. However, Val argued that your style will not matter without accurate grammar. So, make sure you have consistent and correct grammar. The best sentence structure is the simplest, most accurate, and most translatable one. Your consumers are not your B.F.F.s (best friends forever)!
Sure, many companies have a style guide, but no one has the time to use it. So, there is only one solution to ensure consistency – automation. Enterprise content optimization software is highly configurable, runs in almost any authoring environment, catches the errors, and keeps everything consistent. On that note, Val passed the “torch of wisdom” to Torsten to talk about the Congree Authoring Server.
In the beginning, there were unstructured formats and overwritable styles. Document structural consistency depended on the author’s training and discipline, and content reuse meant simple copying and pasting. To combat that, structured authoring takes decisions about structure and design away from individual authors to make content reusable. It also facilitates information exchange to prevent information silos and makes multi-channel publishing possible.
Controlled language also emerged as a subset of a natural language that has definitions and restrictions for grammar, spelling, and style. According to Torsten, there are a few out there that were developed by big companies to enforce consistent content. Controlled languages improve readability, eliminate ambiguity and complexity, and improve the quality and comprehensibility of translations.
As Antoine de Saint-Exupéry says in The Little Prince, “Language is the source of all misunderstandings”.
Unfortunately, no one has the time or ability to remember everything in a style guide, so you must automate this process. This is where the Congree Authoring Server comes into play. The tool has three components – language check, authoring memory, and terminology – which together help drive consistency across all your content.
For anyone who has questions about the Congree tool, feel free to connect with Torsten on LinkedIn to learn more!
A few of the notable questions from the Q&A pod:
Q: In Congree Authoring Server, how does it connect to an authoring tool? Or is it its own authoring tool by itself?
A: It can be plugged into your authoring environment, as long as that environment supports it.
Q: Can you add pictures into the terminology side of Congree Authoring Server?
A: No, but you can create a workaround like inputting reference links for graphics.
|## Beyond the OT### Leveraging DITA for enterprise content needsSarah O’KeefeCEO at Scriptorium, USA|
Sarah O’Keefe, CEO of Scriptorium, USA, gave us a partner presentation on leveraging DITA for enterprise content needs. She began by urging us to consider our DITA open toolkit options. FrameMaker through AEM is (more or less) built in and available. InDesign support, however, is a bit more “out there”.
Extending the open toolkit is cheaper than the alternatives. Extension possibilities include reformatting, extending, and rearranging the open toolkit through proper architecture and plugins (but please be careful to ensure everything stays stable!) If you want to do new things with the open toolkit, you may want to look at GitHub. There are over a thousand repositories with a ton of stuff out there that you can check out. Even though not all of these are constantly maintained to work with the latest releases, this resource is worth a look!
But what if GitHub is not enough? If you find yourself wanting something more “out there”, something nobody has done before, yet are unsure of how to put it together, this is where SCORM can come into play. SCORM is a package of learning content: HTML with a wrapper. However, SCORM has variants, and you need to find out which one your LMS accepts.
Regarding InDesign, KBs, and more:
- InDesign has its own XML languages (IDML: complicated; ICML: text plus paragraph and character styles and images, but no formatting specifications)
- Several commercial solutions are available
In many cases, although there is a lot of demand for InDesign, there is not always a need for it; a program other than InDesign would often suffice. There needs to be a strong justification for pristine print output:
- Example: textbook publisher
- Not an example: picky technical writers (there can always be outlier cases!)
Separation of content and formatting is key to getting from the source XML content to InDesign.
Source XML (content only): The start. Process the DITA to generate ICML (the easier version), take an InDesign template where you have predefined all the necessary styles, and place your InDesign XML into the template. This creates a ~98% formatted InDesign file. You then take your InDesign file and make the fine-grained fixes you came to InDesign for in the first place. The open toolkit plugin handles the move from the InDesign XML into the InDesign template.
InDesign template: Some work/formatting is required to build out a structured template. Use paragraph, character, table, object, styles, etc.
InDesign: Once the above actions are taken, you can combine the content and the formatting to create the final InDesign file. Sarah’s advice: keep the content and the formatting separate for as long as you can. This functionality can get very pricey, very fast.
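As a rough illustration of that content/formatting split, here is a minimal sketch (not the actual open toolkit plugin) of how content-only ICML merely *names* styles, while the style definitions live in the InDesign template. The style name "Body" is a placeholder for whatever the template defines.

```python
# Sketch of the content/formatting separation: the ICML output names
# styles but carries no formatting of its own; the InDesign template
# supplies the actual style definitions. Simplified, illustrative only.

def para_to_icml(text: str, style: str = "Body") -> str:
    """Wrap one paragraph of content in a minimal ICML ParagraphStyleRange."""
    return (
        f'<ParagraphStyleRange AppliedParagraphStyle="ParagraphStyle/{style}">'
        f'<CharacterStyleRange AppliedCharacterStyle='
        f'"CharacterStyle/$ID/[No character style]">'
        f"<Content>{text}</Content>"
        f"</CharacterStyleRange></ParagraphStyleRange>"
    )

print(para_to_icml("Fasten the bracket before mounting.", style="Body"))
```

Because the fragment only references "ParagraphStyle/Body", swapping in a different template restyles the whole document without touching the content – which is exactly why Sarah advises keeping the two apart for as long as possible.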
Can we do this with PowerPoint? Good question. The problem with PowerPoint is that the people who use the program do not tend to love templates. PowerPoint is often used as a text dump, which you cannot simply do with DITA. If you are going to use DITA with PowerPoint, you may require 6-7 slide layouts, which require the presentation creator to be very specific and structured (and that will not suit every PowerPoint lover’s needs).
So, can we get DITA into a knowledge base? The answer is yes. Think about a KB article: it is more or less compatible with a DITA topic structure. Knowledge base articles usually have specific metadata. The sticking points:
- Where do you do updates?
- How do you connect DITA content to the knowledge base database?
Several CCMSs (including AEM) have connectors for KB systems such as Salesforce, Zendesk, etc. Yet the question of where you do the updates still stands. Also, who owns the articles? How do you handle conflicts?
JSON, resource files, and other outliers:
- Text markup is possible through the open toolkit. However, you may need to modify DITA content to provide the right hooks. When rendering an entire topic (a title plus several paragraphs, for example), JSON is not the best at understanding “Yes! This is a topic, a unit of content.” JSON tends to go only as far as “Yes, this is a text string.” Utilize the <resourceid> element for context-sensitive help.
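To make the KB/JSON idea concrete, here is a hedged sketch of turning a DITA topic into a JSON record a KB system could ingest, with the `<resourceid>` element carrying the context-sensitive-help hook. The DITA snippet and the field names (`article_id`, `body`, etc.) are illustrative; real connectors (Salesforce, Zendesk, etc.) define their own schemas.

```python
# Sketch: render a DITA topic as a JSON record for a hypothetical KB
# system. The topic below and the output field names are made up for
# illustration; they are not a real connector's schema.
import json
import xml.etree.ElementTree as ET

DITA_TOPIC = """
<topic id="replace-filter">
  <title>Replacing the filter</title>
  <prolog><resourceid id="help-4711"/></prolog>
  <body><p>Turn off the unit.</p><p>Remove the old filter.</p></body>
</topic>
"""

def topic_to_kb(topic_xml: str) -> str:
    root = ET.fromstring(topic_xml)
    resourceid = root.find("./prolog/resourceid")
    record = {
        "article_id": root.get("id"),
        "title": root.findtext("title"),
        # <resourceid> carries the hook for context-sensitive help
        "context_help_id": resourceid.get("id") if resourceid is not None else None,
        "body": [p.text for p in root.findall("./body/p")],
    }
    return json.dumps(record)

print(topic_to_kb(DITA_TOPIC))
```

Note how the topic’s structure survives the trip: the JSON still knows which string is the title and which strings are paragraphs, rather than treating everything as one flat text string.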
DITA to markdown is straightforward. When asking about the specifics of other formats (what about [insert format here]?), Sarah suggests you consider the following:
- Is there a related open toolkit plugin?
- Is there a text format?
- Is there an API connector?
- How much money do you have?
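The “DITA to markdown is straightforward” point is easy to see: most common DITA elements map almost one-to-one onto Markdown constructs. The sketch below is a minimal, hypothetical illustration of that mapping; the real open toolkit markdown plugin handles far more of the vocabulary.

```python
# Sketch of why DITA -> markdown is straightforward: titles become
# headings, <p> becomes paragraphs, <ul>/<li> become bullet lists.
# Minimal and illustrative only.
import xml.etree.ElementTree as ET

def topic_to_markdown(topic_xml: str) -> str:
    root = ET.fromstring(topic_xml)
    lines = [f"# {root.findtext('title')}", ""]
    for el in root.find("body"):
        if el.tag == "p":
            lines.append(el.text or "")
            lines.append("")
        elif el.tag == "ul":
            lines.extend(f"- {li.text}" for li in el.findall("li"))
            lines.append("")
    return "\n".join(lines).rstrip() + "\n"

md = topic_to_markdown(
    "<topic id='t1'><title>Safety notes</title>"
    "<body><p>Read before use.</p><ul><li>Wear gloves.</li></ul></body></topic>"
)
print(md)
```

The lossy direction is the catch: markdown has no place for most DITA metadata or conditional attributes, which is why the cost/benefit questions above still apply.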
Best practices to consider:
Either deliver a finished package (like PDF, EPUB, or SCORM) or process content and formatting separately
Downstream editing causes content control problems
- Creates divergence
- Content changes will always sneak in
- Hardly worth the effort
Be very clear about your source (it’s editable) versus your target (delivery only)
Consider whether you can invert the problem.
- Can you use DITA as an end point instead of a starting point?
- Try to avoid round-tripping. It exacerbates…everything.
Markdown as source content?
- DITA is an output target
- Same considerations will apply, but now in reverse
- Do not edit in DITA
- Use only as a pass-through to other formats
- Could mix markdown-source content with DITA-authored content
If you are new to DITA, Sarah O’Keefe suggests checking out this Learning DITA resource for free courses and more information.
A notable question from the Q&A pod to wrap up the night:
June Harton asked: Why would subsections get you into trouble in relation to KBs? Sarah’s answer: If you have a DITA topic and the topic has sections inside it, in general those sections cannot have nested sections inside them. This creates the limitation.
Wow! And that is day two of Adobe DITAWORLD 2020. As technical communicators, we were hit with truths and realities today! DITA has played important roles in all our workplaces, and its use has created great stories of success that virtually anyone can achieve by utilizing (1) the right tools and (2) the right mindset. Both are readily available and widely used by all our great presenters.
Several of our presenters today used stories to show how access to these tools has positively impacted their workflows, their content, and their audiences’ reception of that content. These stories have all ended in either happy endings or a plot leading them straight to one. However, unlike fairy tales, these stories can become real for you, too!
So, please join us in setting alarms and getting some well-deserved rest in preparation for the third and final day of Adobe DITAWORLD 2020.
Topics: Technologies, TechComm, TechComm Import