Fredrickson Communications

Tony Tao

Tony Tao develops eLearning courses using authoring tools ranging from Adobe Dreamweaver and Flash to rapid development tools such as Adobe Captivate and Articulate Studio. He also works with Fredrickson’s clients in roles that include instructional design, content development, visual design, and project management.

Before joining Fredrickson Communications, Tony worked as an instructional designer and training specialist in several organizations and companies in both the US and China. Tony received his MS degree in instructional design and training at St. Cloud State University. His research focused on asynchronous eLearning practices and development tools.

Articulate vs. Captivate: The complete series.

by Tony Tao, Instructional Designer and eLearning Developer

Editor’s Note: This Articulate vs. Captivate article originally ran as a series of entries on the Fredcomm Blog. Because of the popularity of this series, we’ve combined all of the blog entries into a continuous article to make it easier to read.

Articulate vs. Captivate Part 1: Comparing popular rapid eLearning development tools.

As rapid eLearning development tools become prevalent in the market, course development is getting faster, and some aspects are getting easier and less costly. Among the many rapid development tools available, Articulate Studio and Adobe Captivate have become the most popular and widely used among our clients.

As an eLearning consulting company, we are often asked for advice on which is best: Articulate or Captivate? The question usually comes from corporate learning groups who want to choose a standard tool for use within their company or group.

I want to note here that when I refer to “Articulate” in these blog entries, I’m referring to the full Articulate Studio package. While it is possible to buy individual Articulate products (like Articulate Presenter), I don’t think this makes sense for most needs because without the full Articulate Studio, the functionality and results would be limited.

So which is better, Articulate or Captivate? Of course, there’s no clear way to answer this question except to say “it depends”. Both tools work well in different areas and for different reasons. I’ll start this series of blog entries with the things that both Articulate and Captivate have in common. In upcoming entries, I’ll look at what each tool does well and not-so-well.

I have to add that the skill and experience of the developer still matter. These tools are often purchased with the expectation that anyone will be able to use them to create great eLearning courses. The problem is that as developers and learners have demanded more sophistication from the courses that these tools produce, the number of features and the complexity of using these tools have increased with each new version. Whichever tool you choose, there is no substitute for knowing how to use it efficiently and effectively. The more skilled and experienced you are at using these tools, the better your results will be.

Since I’m a developer, I can’t resist starting with ease-of-development. From this standpoint, both tools are relatively easy to jump into (at least at a basic level) without extensive coding knowledge or formal training. Basically, developers use the built-in templates to build courses by adding written learning content, creating interactive components, and then adding audio, and so forth. The templates take care of the user interface, the navigation, and other features so these don’t have to be built from scratch as they would if you were developing using other technologies like Adobe Flash.

Both Articulate and Captivate have a number of features in common:


  • Quiz development – Both tools can develop quizzes with an assortment of question types to cover different needs and to provide variety.

  • LMS connectivity features – Both tools allow the developer to define the LMS connectivity settings for the published course and then save those settings (see the sketch after this list for what that connectivity looks like under the hood). As with anything to do with an LMS, how close these settings get you to plug-and-play connectivity with your LMS will vary, but it’s still a significant advantage compared to developing courseware in other technologies.

  • Flash-based output – Both Articulate and Captivate produce Flash-based courses that play in a standard browser (of course, Adobe’s Flash Player must be installed). But even in this similarity, there is a difference to note. Captivate publishes courses in a single SWF (Flash) file, whereas Articulate publishes the course as a “package” that includes multiple SWF files in a pre-defined directory structure. There are some advantages and disadvantages to each approach and I’ll get into these when I discuss the specifics of each tool in future entries.

  • Learning interactions – Both packages can produce low- to moderate-complexity learning interactions and both support branching. Of course, the types of interactions, the sophistication, and the ease of development vary with each package.

  • Skins, color schemes, and interface customizations – At a basic level, both packages allow user interface (UI) customizations. The developer can change color schemes and button labels, turn certain features on or off, and change other UI elements. In my experience, the UI customization that users are most interested in is the ability to change color schemes to match corporate or group branding standards. Both packages offer enough options to keep most users happy in this regard.
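
To make the LMS connectivity point above a little more concrete, here is a minimal sketch of the kind of SCORM 1.2 runtime calls a published course typically makes behind the scenes. Neither Articulate nor Captivate requires you to write any of this yourself; their publish settings generate the equivalent wiring for you. The findApi() helper and the hard-coded score below are illustrative assumptions for this sketch, not code taken from either product.

```typescript
// Illustrative sketch only: the SCORM 1.2 calls a published course makes to an LMS.
// Both tools generate code along these lines when you publish with LMS output selected.
// The findApi() helper is a simplified, hypothetical version of the standard
// window-walking lookup described in the SCORM 1.2 specification.

interface Scorm12Api {
  LMSInitialize(arg: string): string;            // returns "true" or "false"
  LMSGetValue(element: string): string;
  LMSSetValue(element: string, value: string): string;
  LMSCommit(arg: string): string;
  LMSFinish(arg: string): string;
}

// SCORM 1.2 says the LMS exposes an object named "API" on the course window
// or on one of its parent windows.
function findApi(win: Window): Scorm12Api | null {
  let current: Window | null = win;
  while (current) {
    const api = (current as any).API as Scorm12Api | undefined;
    if (api) {
      return api;
    }
    if (current === current.parent) {
      break;
    }
    current = current.parent;
  }
  return null;
}

const api = findApi(window);
if (api && api.LMSInitialize("") === "true") {
  // Report completion and a (hypothetical) score, then save and end the session.
  api.LMSSetValue("cmi.core.lesson_status", "completed");
  api.LMSSetValue("cmi.core.score.raw", "85");
  api.LMSCommit("");
  api.LMSFinish("");
}
```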

Now we come to the point where the tools start to diverge. Articulate and Captivate work differently, and each tool has advantages and disadvantages when it comes to certain features and uses. To understand which tool is the better choice, you need to consider the tools in light of your or your organization’s needs and the types of training you develop or intend to develop. You also need to consider the developer skills you possess or, in the case of a corporate learning group, the skills you have available on your team.

In the following entries, I’ll walk through what I think are the key functions of each tool, the types of training that I think they work best for, and finally I’ll give some thoughts about developer skills, publishing and deployment concerns, and other considerations.

Articulate vs. Captivate Part 2: Exploring Articulate Studio

In the previous section, I started to explore two of today’s most popular eLearning rapid development tools—Articulate Studio and Adobe Captivate. Now I’d like to talk about each of them separately and in more detail, starting with Articulate Studio. In the process, I’ll also discuss some of the best practices that may help with your development.

Just in case you’re new to Articulate Studio, I want to mention that there are four main components: Articulate Presenter, Engage, QuizMaker, and Video Encoder. If you need info or a refresher on what each component does, have a look at Articulate’s website.

Let me start by asking you a simple question: What is Articulate Studio?

The answer I most often hear goes something like this: “Articulate converts PowerPoint to a Flash presentation.” Technically, this is a true statement and it’s one of the factors that attracts many people to Articulate in the first place—it doesn’t require much in the way of programming skills to jump on board. Although using Engage and QuizMaker requires more practice, most users can get familiar with these Articulate Studio components in a short period of time.

For those shopping for rapid eLearning development capabilities, it can seem as if all you need to develop a good course is PowerPoint content to run through Articulate and out comes eLearning. This is an especially attractive proposition for those who are tasked with “converting” instructor-led training courses to be delivered as eLearning.

The problem that I hear over and over from both eLearning developers and actual learners is that the “PowerPoint look” of Articulate courses wears thin very quickly. Something’s missing, but what?

To answer this question, I have to stray a little from talking about tools and take a quick dive into instructional design. As you probably know, the traditional use of PowerPoint is in classroom-based training, which is also called synchronous or instructor-led learning. By contrast, Articulate eLearning courses are, of course, an asynchronous (self-paced) learning experience.

You probably see where I’m headed already: even if the course contains the same content, we have to take quite different approaches once the delivery medium changes. To substitute for the richness of activities and interactions that can take place in the classroom, we need to build a new layer of richer interaction and engagement on top of the content in the PowerPoint in order to make it effective as an eLearning course. When this layer is missing, people see the course as a shallow PowerPoint presentation, not as real learning.

I know that this problem is not just an Articulate Studio problem, but because of Articulate’s direct link to PowerPoint, it seems even easier for Articulate users to fall into this trap. Remember, a PowerPoint presentation is only one ingredient. One ingredient doesn’t make a cake.

Fortunately, Articulate Studio gives you plenty of options to produce a richer eLearning course that goes beyond PowerPoint: Engage interactions, quiz questions, Flash movies, and even customized Flash games, for example. In addition, Articulate allows you to deliver your content through branched scenarios, which is another effective way to keep learners’ attention.

Articulate Studio offers a lot of eLearning potential in one package. I’m not going to do a feature-by-feature list here—you can easily get that information elsewhere. Instead, I’d like to highlight just a few of the features that I think are significant and either little-known or not often used to their potential:


  • Articulate’s QuizMaker tool offers plenty of new features to enhance the learning experience. For example, you can insert a blank page to deliver more content or a background story in order to set up a scenario. Also, with the “Slide View,” you can adjust the location of your question, the choices, and the related graphics.

    You may also use the drawing tool to create simple graphics or add special treatment to the existing graphics. The new timeline feature allows you to adjust the timing on all of the elements on the page. For example, you can synchronize your choices with the audio.

  • With the annotation tool in Presenter, you can easily add professionally designed annotation shapes and spotlight effects to your presentation. This is extremely efficient and effective when creating software demonstrations with highlighted areas.

  • Source file management is easier than ever before. The “Send to Articulate Package” function packs everything you need, including the PPT deck, audio/video clips, and even the attachments, into a zip file. This makes it very convenient to hand off the project to a client or to a different developer.

After this discussion of my favorite features, I feel I have to deliver a brief word of warning. I’ve been using Articulate for about 7 years now and the product has evolved significantly. Many people used to see Articulate as a simple tool that would enable anyone to develop eLearning. This may or may not have ever been true, but what has happened over time is that eLearning developers and instructional designers have demanded more and more sophistication. And Articulate has largely delivered, but this means that to get the most out of Articulate, you have to be more and more skilled as a developer to take advantage of the richer features. Therefore, I think it’s best to look at Articulate as a “development suite” and the results really are closely linked to the developer’s skill and the instructional designer’s understanding of how to design learning to take advantage of Articulate’s strengths.

Since most Articulate courses involve an audio presentation with closed-caption text, they require a different design approach in PowerPoint. Research indicates that when audio and static text are presented at the same time, audio is the dominant and more efficient channel. Therefore, it’s often a distraction if the bulleted text repeats the audio. In many cases, it’s more effective to replace bulleted text with graphical elements like photos, illustrations, flowcharts, and animations.1

In the previous section, we talked briefly about software training. Can I use Articulate to develop this kind of training by itself? Again, it depends on what you want to achieve in the training. If the training only involves demonstration, you can insert a series of screenshots on the PowerPoint slides and then spice them up with the annotation tool in Articulate. Gerry Wasiluk posted some excellent information on this topic as comments to my first Articulate vs. Captivate blog entry.

Or you may opt to use one of the screencasting tools, such as Screenr. With these tools, you can easily export your screencasts as video clips and then insert them into your Articulate course later. However, if you want to drop a comprehensive simulation into your course, I would say that Articulate is not your best option. If software simulation is your goal, you should consider Captivate, which I will cover in the next entry in this series.

1 Of course, a transcript should be available so that learning content can be accessed by those who cannot hear the narration.

Articulate vs. Captivate Part 3: Exploring Adobe Captivate

In part two of this series, I explored Articulate Studio in more detail. Now it’s time to do the same with Adobe’s Captivate.

Captivate is a comprehensive rapid eLearning development tool for creating software demonstrations, interactive simulations, and quizzes. Compared to Articulate Studio, Captivate offers a better workflow for taking the developer from screen recording to interaction building. Most Captivate projects follow the “see it, do it” approach. In the “see it” segment, the learners watch a recorded demonstration. In the “do it” segment, the learners complete a series of tasks in the simulated environment — for example, adding information to a customer’s account.

Like Articulate Studio, Captivate provides the users with some essential functionality, such as customized skins so that the look and feel can be modified. It also offers text/graphic animations, audio synchronization, interactive components, and publishing options for both web and LMS delivery.

Let’s take a closer look at these features.


  • Customized skins. Both Articulate and Captivate offer the flexibility to customize the “skin,” which is the user interface of the eLearning course. In Captivate, the developer has more options to choose different control bars from the gallery and then perform further customizations with different color schemes. You can also create your very own project skin from scratch, either by developing it in Flash or by building it with Captivate’s master slides (available in Captivate 5 and higher).

  • Animation. Unlike Articulate, which relies on animations built in PowerPoint, Captivate creates all types of animations with the “effect” function on its Flash-like timeline. In earlier versions of Captivate, the animation types were limited to fade in/out and animated text. Starting with Captivate 5, more animation effects can be applied to any object, such as a caption box, a graphic, or a drawing. These functions also offer precise control over the timing of an animation—for example, having a box fly in from the left of the screen at exactly 14.5 seconds.

  • Audio synchronization. To synchronize audio in Captivate, the best approach is to use the timeline. This might be challenging for users who are not familiar with timeline-based applications such as Flash or Premiere. Comparing Captivate directly to Articulate, the initial synchronization process can take longer in Captivate. However, it is a lot easier to adjust the synchronization in Captivate. For example, if you later decide you want a caption box to come in a little earlier, you can precisely adjust the timing of that object without touching anything else. Articulate requires the user to re-synchronize the whole slide, which is much more time-consuming.

  • Interactions and branching. Instead of using a pre-built template such as Engage, Captivate builds its own games and interactions using rollover captions, buttons, and slidelets. Starting with Captivate 4, Captivate introduced variables and ActionScript support, which allows the developer to create more complicated learning activities within Captivate. Of course, advanced programming skills are required to perform this kind of development, so again we see the trend of these rapid development tools becoming more like “development suites.”

  • Publishing for both the Web and the LMS. As with Articulate, you can publish your Captivate project for both Web and LMS delivery. Your project, including the audio, the text, and the interactions, is compiled into one SWF file. The playback skin, animations, and widgets can be exported into separate SWF files within the delivery package. If your project contains too many slides or too much audio, the loading time can become a major issue. In this case, you may consider splitting your course into smaller modules and then binding them with Adobe Aggregator. Another well-publicized issue is AICC compliance, but it seems that Captivate has resolved this in version 5.5.

To enrich the functionality of Captivate, Adobe has developed some add-on applications, such as text-to-speech, widgets, a review tool, and a quiz result analyzer and aggregator. Developers can find even more add-ons on Adobe Exchange. Articulate has a similar online community and encourages developers to submit their customized interactions.

The main difference that I have observed between the two online communities is that the Adobe Exchange community tends to be more willing to share code and methods for free. Of course, these are often just a starting point; the developer then needs to finish the object. Articulate community members, on the other hand, often offer finished enhancements such as interactions, but because these finished objects take larger amounts of time to create, the members often want to charge a fee.

After comparing Articulate and Captivate side by side, we have seen a lot of similarities and a few significant functional differences. One of the biggest differences I can highlight is the development process and the mindset it takes to get the most from these tools. In the next section, I will conclude this Articulate vs. Captivate comparison series by discussing the circumstances and uses where I think each of these tools excels.

Articulate vs. Captivate Part 4: And the winner is. . .

In the previous parts of this series, we have explored the major features of Articulate and Captivate, and discussed the strengths and limitations of each tool. Of course, there really isn’t a winner. As I wrote at the beginning, the only answer to the question “Which is better?” is “It depends.” The tools have different strengths and the best fit depends on your needs.

And for larger organizations or those with more complex or varied learning needs, the answer to the question “Which should I buy?” is often “Both.”

Here’s a summary chart that I think clearly highlights the strengths of the two tools. Of course, some of these items can’t be reduced to a simple yes-or-no answer, so in some cases this chart simply reflects my opinion.

In 2012, we will see new players joining the rapid eLearning tool game. For example, Articulate Storyline and ZebraZapps are already attracting a lot of attention. There is also the possibility of new releases of Articulate Studio, Adobe Captivate, and SmartBuilder.

One of the interesting trends that we have noticed is the rise of mobile learning and how quickly the rapid eLearning tools are incorporating functionality that gives them the potential to create mLearning content. For example, most of the new tools can publish your project as HTML5 or in the MP4 video format. This gives eLearning developers an easier path to getting a course running on Apple mobile devices such as the iPad.

I expect to see more projects developed with these new tools in 2012 and I will be using them myself for Fredrickson’s Learning business. As always, I’m glad to share my thoughts and findings with you.

J Hruby

J. Hruby is Fredrickson’s Vice President of Sales & Marketing and he also works with Fredrickson’s clients to develop learning strategies and related eLearning, training, and performance support products.

J. enjoys writing articles and presenting to professional organizations about issues related to eLearning, user-centered design, and the role of technology in improving performance. He has presented seminars to the local chapter of the American Society for Training and Development (ASTD), the Minnesota Government IT Symposium, and the Society for Technical Communication (STC).

Before joining Fredrickson, J. was a training and quality systems documentation manager for AlliedSignal and Honeywell.

Learning Trends - Where will they lead in 2011?

by J Hruby, Vice President, Sales & Marketing

With contributions by:
John Wooden, Director of Usability Services
Tony Tao, Instructional Designer and eLearning Developer

It’s 2011. Has my jetpack arrived yet?

Introduction by J. Hruby, Director of Marketing

Learning and development, like any other business, has its trends, innovations, success stories, bubbles, and busts. As we begin a new year, it’s fun and interesting to look at the landscape and take stock of the trends and technology.

And who can resist thinking about and trying to predict where these trends and technologies might take us in 2011?

Making predictions is fun, but predictions can also be informative. In trying to determine where a trend might take us, one has to look at the reasons why it might take us there. A lot can be learned from listening to the give-and-take conversation that weighs both the promised benefits and the foreseeable hurdles.

As anyone who has ever looked at futurist articles like the ones that were a mainstay of Popular Science in the last century knows, it’s notoriously difficult to correctly predict the future. In fact, it’s tough to even get close. Anyone commute to work today using a jetpack? Unfortunately, I think not. Setting aside the fact that it would have been a pretty chilly commute (presumably, the future was supposed to be warm), there just aren’t many jetpacks out there compared to what was predicted. Most of us aren’t using personal helicopters or hovercraft either. Now that we’re in the future, we just don’t fly as much as we were supposed to.

It’s good for us to remember that the future is slippery because today’s logic and limitations don’t apply to tomorrow. We also tend to forget that adopting and adapting to new things takes time even today. This fact alone seems to provide a stark (and perhaps ironic?) contrast to our 21st century expectation that most things should happen instantly. Or sooner.

Here are a few trends that some of my fellow Fredcommers and I are watching in 2011.

The iPad as a learning delivery platform

Prediction by J. Hruby, Director of Marketing

Prediction: Despite a lot of talk about the potential, adoption of the iPad as a learning delivery platform will be slow in 2011. It’s not all gloomy for learning professionals who want to harness the advantages of the tablet computer, however. Tablet competitors to the iPad will rush into the market in 2011, bringing more choices and lower cost.

Of course there’s been lots (and lots, and lots) of talk about the iPad’s potential as a slick portable delivery platform for learning. Beyond talking about it, though, I think actual enterprise learning adoption will be very slow in 2011. No doubt, the iPad is a cool tablet computer, but as an enterprise learning platform it has many hurdles to overcome.

First, there’s the iPad’s current inability to fully utilize Flash. Apple must have reasons for this anti-Flash direction, but introducing the iPad with what I’ll charitably call a glaring omission makes it especially unattractive as a business learning platform. At the very least, existing content that uses Flash components would need to be carefully tested and parts that don’t work would need to be re-developed specifically for the iPad. Developing new learning content destined for the iPad will also have an added layer of complexity due to Apple’s Flash-unfriendly stance. All of this adds up to a hassle factor that most learning content owners and developers don’t need.

Second, there’s the cost. As of this writing, the iPad starts at $499 and rises rapidly to a stratospheric $829 for the 64 GB version with Wi-Fi and 3G. Ouch! The iPad’s pricing obviously hasn’t presented a barrier for individual consumers, but that’s not a great benchmark. In a more conservative corporate environment, a proposal to buy numerous mid-range iPads at a cost of $600 each to deliver mobile learning to, say, field salespeople will likely receive close scrutiny, to say the least.

The broader trend in 2011 seems to be the emergence of tablet computing alternatives to the iPad. If a learning initiative is hung up by the cost of the iPad, how about a very capable $300 Archos 70, running the Android OS instead? This emergence of the Android tablet PC is one of the biggest trends of 2011 and CNET’s portable electronics correspondent, Donald Bell, correctly predicted a blizzard of new Android-powered tablet launches at the 2011 Consumer Electronics Show in January. The New York Times also has covered this Android tablet invasion.

This introduction of lower-cost iPad competitors can only be a good thing for cost-constrained learning groups eyeing the possibilities that these portable devices offer. I can see the iPad featuring in high-profile or high-ROI learning initiatives where the iPad does double duty as a reward or incentive. This approach seems to leverage the best feature of the iPad: the cool toy factor.

I also recently had a conversation with a forward-thinking learning professional who successfully justified the purchase of iPads for a learning initiative based on a combination of cost savings and environmental benefits. I took this anecdote as a sign that even in the current business climate, there will be some room for learning use of the iPad and, increasingly, other tablet computers. But I think the real shift will occur as people acquire tablet computers for other business uses, and learning groups can then simply focus on developing learning products.

Social Learning

Prediction by John Wooden, Director of Usability Services

Prediction: 2011 will see the continuation of a multi-year trend toward more widespread adoption of online social learning in corporate enterprises. Organizations that have not yet implemented tools to allow for online social learning will do so, and those that have will begin to confront some of the technical, cultural, and behavioral challenges these tools pose.

Social learning is usually understood to mean social media applied to organizational learning, either independent of formal learning content (a company-wide wiki or employee knowledge-networking site, for example), or integrated into formal eLearning and instructor-led training (a course blog, wiki, or discussion forum, for example).

One reason why more enterprises will enable online social learning is the enormous popularity of social media and the expectation of many younger employees that they will be able to use social networking tools in the enterprise to ask questions, share their perspectives, and post profiles. But a more important reason for the rise of enterprise social learning is that organizations will want to increase the speed of knowledge transfer – among employees, between employees and suppliers, and between customers and employees.

The competitive advantage of rapid knowledge transfer is only going to become more important in the coming years, and enterprise social media will play a critical role in this, amplifying and extending learning beyond the classroom or eLearning course, allowing employees, suppliers, and customers to learn by connecting with each other in a wider circle than would otherwise be easily possible.

Because of this perceived business value, more organizations will begin to implement or further develop their social learning infrastructures in 2011. More organizations will also begin to confront the technical, cultural, and behavioral challenges posed by social learning. We will see a shift from the excitement – and perhaps inflated expectations – that come with initial adoption, to a problem-solving attitude.

If Step 1 in implementing a social learning infrastructure is to get the tools out there, Step 2 involves helping employees become effective social learners – and this step has been overlooked by a good number of organizations. While many people understand the basic mechanics of how to use social networking tools – because many are already using Facebook, YouTube, and LinkedIn – a lot of employees understand much less about using enterprise social networking tools to support business and learning objectives.

The next few years will see more Learning and Development departments realize that they need to take charge of “engagement training” to help employees – and help their organizations – better understand what social learning is and how it can work. It’s perhaps ironic that L&D departments will employ some old-fashioned training techniques to get people to understand how to effectively use the new-fashioned stuff, but this will happen. For example, employees need to learn what makes for an effective online community, a compelling blog post, a useful profile, and so on. They will need to learn when information is better shared on an internal forum or a community site than through e-mail, or posted on a wiki rather than saved to a folder on a shared drive.

Another issue that organizations will begin to confront is how to reconcile various social tools with each other, with their respective enterprise learning management systems, and with their enterprise search capability. For two interesting perspectives on this issue, see Dave Wilkins’ lengthy but interesting blog post “A Defence of the LMS (and a Case for the Future of Social Learning)” and Dan Pontefract’s response, “Standalone LMS is Still Dead (rebutting & agreeing with Dave Wilkins).”

All I will venture to predict here is that this issue will heat up over the next year – fueled by LMS vendors and social software vendors – but it will be far from resolved in 2011.

Online Learner-generated Course Reviews

Prediction by J. Hruby, Director of Marketing

Prediction: 2011 will be looked back on as the year where the concept of learner-generated reviews takes hold in enterprise learning.

Almost every website where you can buy something also offers the chance to both create and read customer-generated reviews. Why can’t we have the same function for enterprise learning products as well? The ability to choose courses by reading what others think of them, and then to leave our own review comments, would certainly provide a wealth of useful information.

Good question! I’ve heard of several forward-thinking learning professionals who are trying to do just that — offer their learners the ability to leave reviews about the courses they take that can be seen by other learners.

I think 2011 will be the point where this trend starts to go mainstream. This is a very useful and relevant technical direction for both learners and learning development professionals, and for a number of reasons I think 2011 will be looked back on as the year it really took hold.

First, let’s look at this from the “consumer perspective,” i.e. from the learner’s viewpoint. Reading online reviews has become a major step in the consumer buying process. We value the opinions of people whom we perceive to be our peers and this certainly applies to the process of trying to decide which learning products are worth consuming and why. Even in the case of mandatory courses, learners want to know what to expect.

Now, looking at the other side, there’s a certain fear factor on the part of learning professionals that learners will abuse their newfound ability to leave reviews by trashing every course they take. In reality, this fear never seems to become a reality, for a simple reason: in a corporate learning setting, people understand that their comments aren’t anonymous. Reviewers will keep it in-bounds because they know they’re still at work and work rules still apply. So if we get beyond the fear in 2011, we can move on to the benefits.

And there are benefits. The benefits to L&D professionals should be clear: honest feedback is useful, or at least it should be useful. In reading online reviews of all sorts, I’m inclined to think that when people think of their review as helping people who are essentially just like themselves, they tend to leave more in-depth and meaningful feedback. This, of course, is more useful to anyone who wants to use feedback to make improvements.

There is also the case of user/learner expectation to consider. As my fellow Fredcommer John Wooden often points out, people develop their expectations of technology not just from what they experience at work, but from their much broader experiences outside of work. Over time, disconnects between these experiences become especially obvious. It’s not much of an exaggeration to say that you can leave user feedback and read user reviews for practically every product on the web…so why not for learning products at work?

mLearning

Prediction by Tony Tao, Instructional Designer and eLearning Developer

Prediction: mLearning will continue to move ahead in 2011. User expectations and mobile device capabilities will start to narrow the gap between mLearning and eLearning, but the gap won’t go away entirely.

We first started hearing about mLearning about eight years ago. Given the name, it was easy to assume that this trend would eventually lead to the ability to offer eLearning-style courses delivered over our phones.

The reality, so far, seems to be quite a way from that vision. It appears that the technology and other factors have so far steered the main use of mLearning toward performance support. I want to add that there’s nothing wrong with performance support materials, and offering them on a mobile device is often a very good way to increase overall job performance. It’s just that the content and experience continue to make mLearning and eLearning very different media that, so far, serve different purposes.

Mobile phone technology has played a big role in determining exactly how, and how quickly, mLearning grows. The problem is that mobile phones have many different operating systems and capabilities. Some phones offer touch-screen navigation, some rely on keypad navigation. Some devices support Flash, some don’t. Some devices handle web content in ways that make it display and work better on a small screen and some don’t. Beyond the actual device, the capabilities of the mobile phone networks also vary widely.

This lack of common capabilities that the learning developer can rely on makes it very difficult to develop mLearning content that goes beyond text content because it’s almost impossible to know how it will work on the vast number of devices that are in use. This problem is even more apparent to me when I visit my family and friends in China. In China, people change mobile phones very often and the technology infrastructure in China makes it easy to do so.

Of course, one of the things that makes people want to change phones is that they see other phones with more features than their current phone has. In recent years, more and more people in China have switched to smartphones with wireless internet capability, mainly because these devices are quickly becoming very affordable. This rapid change widens the feature gap between new and old phones very quickly, which keeps mobile application designers busy because the capabilities are always changing.

In enterprise learning, I think this lack of common device capabilities has been a big barrier to bringing mLearning closer to eLearning. But will the gap narrow at all in 2011? I think it will.

As anyone can see, there are more and more smartphones in people’s hands these days. Here in the US, it looks to me like we are moving to a point where it will soon be difficult to buy anything but a smartphone. The web browsing experience on these smartphones is getting better, which leads me to think that it will become easier to offer richer learning content that is not heavily impacted by the individual device. At the same time, faster 3G and even 4G networks allow the developer to build more media-rich mLearning rather than just using basic text.

As J. Hruby touches on in one of his predictions, there’s a new angle to consider in 2011: “m” isn’t just about mobile phones anymore. Options for mLearning now include tablet computers, and another thing to keep an eye on is the arrival of new eBooks that feature wireless internet browsing capabilities and color screens.

Your Turn

Comments on our predictions? Want to make your own predictions on the trends you see in learning and development? We’d love to hear your feedback.

Head over to the Fredcomm Blog, where we’ve started a discussion.

Fredrickson Communications eZine - May 2010

by Site Admin

In this edition of the Fredrickson eZine . . .

Satirical, Yet Oh So True

by Molly Emmings, Account Manager
Fredrickson Communications

Here at Fredrickson, we use the social media tool Yammer to keep each other in the loop on our individual goings-on and accomplishments. We also sometimes use it to give each other a little break in the day where we can laugh. The latter happened last week when Rebecca Kuhlman, our Director of Visual Design, posted a link to an article on TheOnion.com titled Nation Shudders at Large Block of Uninterrupted Text.

“Why won’t it just tell me what it’s about?” one reader asks. “There are no bullet points, no highlighted parts. I’ve looked everywhere – there’s nothing here but words.” I find it funny that an article is condemned for being “nothing but words.”

Another reader says, “I’ve never seen anything like it…what does it want from us?”

As most of you probably know, The Onion is known for its sarcasm, drama, and ironic humor. However, in this case, the exaggerated reactions described in response to the (fictional?) poorly-written web article are funny because they’re essentially true. This article drives home several important points about writing for today’s audience.

Read more . . .

Mainstream Mobile Devices – The Smartphone’s Impact on Learning.

by J. Hruby, Director of Marketing
Fredrickson Communications

Last month, fellow Fredcomm’er Pat McGuinn sent me a link to a very interesting radio interview featuring Robert Stephens, the founder of the Geek Squad. Joyce Lasecke also mentioned this interview on the Fredcomm Blog.

I listened to the interview and the observations that Stephens makes, and my thoughts immediately turned toward the impact that smartphones and other mobile devices will have on learning and development.

As with eLearning and social learning, the use of mobile technology for learning is already here and it will continue to grow. Beyond the fact that it makes sense in certain instances, the larger force that will make mobile learning a broad reality will be user expectations. We want our day job to reflect the rest of our lives. If I can use my mobile device to learn something in my non-work life, why can’t I use it for learning on the job as well?

Among Stephens’ many observations about the rise of the mobile device, he comments on the following:

  • The benefits of designing for the small screen – simplicity and usability become paramount because of the limited screen size.
  • The trend toward using someone’s online preferences to predict what they will want or need in the future.
  • Everything new becomes normal or mainstream at some point. Conventions around how, when, and why to use any technology evolve and eventually become a normalized part of our lives.

See my entry on the Fredcomm blog for a link to the full interview on Minnesota Public Radio. Then I’d be interested in your comments on the impact that the smartphone will have on workplace learning and development.

Listen to the interview . . .

The Learning Leadership Summit 2010 – The Power of Purpose for Learning Leaders

Fredrickson Communications proudly presents the fifth annual Learning Leadership Summit on Thursday, July 15.

This year, our featured speaker and workshop leader for the Summit will be none other than Richard Leider, the international best-selling author of The Power of Purpose. Richard will deliver a groundbreaking seminar developed exclusively for the Summit: The Power of Purpose for Learning Leaders.

The Learning Leadership Summit is an annual gathering dedicated to the needs of leadership-level learning and development professionals in the Twin Cities and surrounding area. If you currently hold a leadership position within a corporate or public-sector learning and development organization and you’d like more information about the Learning Leadership Summit, please get in touch.

Molly Hendricks

Molly Hendricks has been in relationship management since the mid-1990s. She led operations and sales for a specialized information technology training firm before joining Fredrickson Communications in 2003. Her focus is to truly connect with clients, gather information, and communicate in a way that will ensure Fredrickson’s team delivers the best solution based on the client’s need, budget, and timeline.

She started the public-sector peer-to-peer networking group Intersect in 2007, which continues to grow in membership.

She holds a BS in Communicative Disorders and Sociology from the University of Wisconsin at River Falls.

Satirical, Yet Oh So True

by Molly Hendricks, Account Manager

Here at Fredrickson, we use the social media tool Yammer to keep each other in the loop on our individual goings-on and accomplishments. We also sometimes use it to give each other a little break in the day where we can laugh. The latter happened last week when Rebecca Kuhlman, our Director of Visual Design, posted a link to an article on TheOnion.com titled Nation Shudders at Large Block of Uninterrupted Text.

“Why won’t it just tell me what it’s about?” one reader asks. “There are no bullet points, no highlighted parts. I’ve looked everywhere – there’s nothing here but words.” I find it funny that an article is condemned for being “nothing but words.”

Another reader says, “I’ve never seen anything like it…what does it want from us?”

As most of you probably know, The Onion is known for its sarcasm, drama, and ironic humor. However, in this case, the exaggerated reactions described in response to the (fictional?) poorly-written web article are funny because they’re essentially true. This article drives home several important points about writing for today’s audience.

Your attention, please

The thing that anyone writing for any online medium wants from readers is attention. And because we all live life at warp-speed, that attention is measured in seconds or milliseconds. We don’t know what to do when we encounter large blocks of text, other than ignore them or avoid them.

Because of this ultra-short attention span, when we are making a point using electronic media of any kind, we must make it concisely. The point might sometimes be emphasized with a photo or a different font; if there isn’t something for readers to grab onto, they move on.

And as they move on, they may also make a mental note not to go back to that same place when they need quick, reliable information. Their trust and your credibility can be shot with just one less-than-perfect experience. So the question is: why risk that kind of reaction to your website, intranet site, or email message?

The principles are pretty simple, yet take some special skills and thought to execute. As Jakob Nielsen wrote in his bi-weekly column called Alertbox 13 years ago (yes, it’s been that long!), people don’t read on the web. They scan the page, picking out individual words and sentences. So, we as web writers must remember to:

  • Use bulleted lists. (See? These bullets got your attention, didn’t they?)
  • Highlight key words. (Got you again.)
  • Be sure we have just one idea per paragraph. Readers will miss any more than one.
  • Use the inverted pyramid style, starting with the conclusion.
  • Be brief. Use just half the words you would for conventional writing.

That said, I think I’ve made my point.

Fredrickson Communications eZine - February 2010

by Site Admin

In this edition of the Fredrickson eZine . . .

The Learning Organization’s Brand

by J. Hruby, Director of Marketing
Fredrickson Communications

The recent news of the problems at Toyota started me thinking about the importance of brand image—not just for businesses, but for all types of organizations and even for individuals.

If you’d asked three months ago, few would have predicted that Toyota would (or even could) ever find themselves in a position where their carefully-crafted image for quality and reliability was in question. I mean, we’re talking about Toyota here! Things change very quickly in an age of instant and constant communication.

One of the most valuable assets a business has is its brand image—the image consumers have of the company and its product(s) in the marketplace. It’s hard to overstate the value of a positive brand image. Beyond the “warm and fuzzy” aspect of being well-regarded, brand image can be a giant business enabler. Because of its image for quality, Toyota has been able to charge premium prices, sell more, and do so while offering less in the way of buyer incentives. These factors combined to make Toyota very profitable in an industry that isn’t exactly known for being profitable at all.

Brand image matters because it influences the perception of value. And who doesn’t want to be perceived as valuable? My musing about the importance of brand quickly turned to the brand of the learning organization and two questions immediately came to mind:

  • If we looked at a learning organization as if it were a company, and the company’s employees and managers as the consumers, what brand image does the learning organization have?
  • What influences the image and the perceptions that others have of the learning organization?

Read more . . .

Leveraging Learning in a Down Economy

A Learning Paths International (LPI) Workshop
March 11, 2010
8:30 a.m. to 5:00 p.m.
Aloft Hotel, Minneapolis

There’s a one-day workshop opportunity that may be of interest to many in the Twin Cities training and development community. Steve Rosenbaum and Ira Kasdan of LPI will present Leveraging Learning in a Down Economy. This workshop will focus on using the Learning Paths Methodology to help participants:

  • Turn employee development into a competitive advantage
  • Get higher productivity and quality from current employees
  • Implement process changes and operational improvements faster and more effectively
  • Quickly get employees fully productive when roles expand or change
  • Capture and transfer best practices before they are lost or leave
  • Drive out time, waste, variability and cost from training
  • Identify and close proficiency gaps in record time
  • Use your onboarding process as a key recruiting and retention tool
  • Dramatically cut the time it takes your salespeople to become fully productive in their roles

Registration

For more information and to register for the workshop, visit the Learning Paths International website.


Recommended Reading

The Experience Economy: Work Is Theater & Every Business a Stage
by B. Joseph Pine and James H. Gilmore

Harvard Business Press
ISBN-10: 0875848192
ISBN-13: 978-0875848198

The Learning Organization’s Brand

by J Hruby, Vice President, Sales & Marketing

The recent news of the problems at Toyota started me thinking about the importance of brand image—not just for businesses, but for all types of organizations and even for individuals.

If you’d asked three months ago, few would have predicted that Toyota would (or even could) ever find themselves in a position where their carefully-crafted image for quality and reliability was in question. I mean, we’re talking about Toyota here! Things change very quickly in an age of instant and constant communication.

One of the most valuable assets a business has is its brand image—the image consumers have of the company and its product(s) in the marketplace. It’s hard to overstate the value of a positive brand image. Beyond the “warm and fuzzy” aspect of being well-regarded, brand image can be a giant business enabler. Because of its image for quality, Toyota has been able to charge premium prices, sell more, and do so while offering less in the way of buyer incentives. These factors combined to make Toyota very profitable in an industry that isn’t exactly known for being profitable at all.

Brand image matters because it influences the perception of value. And who doesn’t want to be perceived as valuable? My musing about the importance of brand quickly turned to the brand of the learning organization and two questions immediately came to mind:

  • If we looked at a learning organization as if it were a company, and the company’s employees and managers as the consumers, what brand image does the learning organization have?
  • What influences the image and the perceptions that others have of the learning organization?

The learning organization’s brand

Like any question that invites feedback that could wander into the category of “painfully honest,” asking about the brand image of the learning organization can take us into potentially uncomfortable territory. But I think taking a step back and assessing the overall image is important for two reasons.

First, the majority of the feedback that many learning groups receive is focused at the course level. In business terms, this equates to getting feedback on the performance of individual products. There’s nothing wrong with this type of feedback, but it doesn’t tell us anything about the overall perception of the learning organization. Are we really seen as we’d like to be: as problem-solvers and performance-improvers?

The danger in relying on feedback at the individual product/service level is that it doesn’t paint the whole picture. Unfortunately, it’s completely possible to meet the customer’s expectations at the basic product level, yet still cultivate a negative brand image in the process. Want an example? Been on an airplane lately?

Admittedly, the airlines have managed to deliver me from Point A to Point B every time I’ve flown, so they do, in fact, deliver a service that meets my expectations on that basic level. But there’s more to it than that, isn’t there? Did they get me where I paid to go? Yes. Do I have a few comments about the new baggage fee fad? Don’t get me started.

There’s no way to avoid the fact that it’s the overall experience that affects how we view an organization. It’s the same with the products or services delivered by a learning group. What do we know about the overall experience that makes up the bulk of how our brand image is perceived?

The other reason I think it’s important to consider overall brand image is because it’s relatively easy to damage and can be very difficult to fix or change. Customers don’t tend to think about our brand as much as they simply know it. Once they feel like they know something, changing their minds becomes a gradual process that requires a sustained effort over time.

Influencing the brand image

People form their image of a company’s brand by linking together perceptions and experiences to form an opinion of the organization and what it stands for. I can think of several key interaction points that could influence the learning customer’s perception of the learning group’s brand as a whole. Let’s break this down into two key categories – the products you deliver and the overall experience of your customer.

The products
The quality and appeal of the learning products – whether classroom courses, eLearning, or the less-obvious products like consulting, needs assessments, or supporting knowledge-sharing – really matter and contribute substantially to brand image. I know I’m not breaking any new ground with that statement, but I think we have to dig deeper than just looking at whether or not the products fulfilled the basic need.

As I already pointed out, the brand image is formed not just by meeting the need, but by how well the need was met. Was it visually appealing? Was it engaging? Was it easy to use? Was it innovative? Was the product interesting or even fun?

Many factors influence the overall perception of a brand. It’s worth mentioning that Apple didn’t invent the digital music player, nor do they have the only player on the market, but the iPod has dominated the category by combining function, ease-of-use, and appealing design. This certainly can provide food for thought for anyone creating learning products. How well we perform beyond fulfilling the basic need is going to be a major driver of our brand image.

The overall experience
As I pointed out earlier, the experience that the customer has while obtaining a product has as much (or more) impact on their perception of brand than the product itself does.

In fact, one of my favorite business books, The Experience Economy, focuses on companies where the experience with the brand makes up a substantial part of the product itself. The book gives a number of examples of how companies make brand experience part of a product. For example, it wouldn’t really be correct to say that American Girl is just a company that makes dolls. Having personally seen the line of parents and children in front of their flagship Chicago store, I can tell you they aren’t waiting in line just to buy a doll. The Apple Store’s café-style approach to computer service and Diesel’s “denim bar” jeans stores are other great examples of the experience making up a substantial part of the brand. Has anyone ever tried to run into an IKEA for just 10 minutes to pick up a few quick items? Good luck! I wonder how many people, if asked, would classify a trip to IKEA more as entertainment than shopping.

So here’s the question: besides taking our training courses, how do our learning customers interact with the brand of our learning organization?

Let’s start with the LMS. Groan! You knew I was going to go there, didn’t you?

Fredrickson’s John Wooden has taken on this topic of the LMS user experience before. In large organizations, the LMS’s learner interface is to learners what Travelocity is to the world of travel, and what Google is to search. The LMS is the gateway to the products or services of many learning organizations. The learner’s brand experience, in many cases, starts with the LMS interface.

The LMS provides a great illustration of the importance of the overall experience when it comes to the image of learning professionals, but there are other examples. These days, how (and how well) we communicate and interact with others online has a major impact on our brand image either as individuals or organizations.

How does the learning organization’s intranet presence measure up? What tools are being deployed to support fingertip knowledge? Do the search results on the company’s intranet include courses, wikis, forums, and other products created or supported by the learning organization?

Pulling up a chair

Ultimately, like the brand of a company, the learning organization’s brand can be a key asset and enabler, not just for individual learning organizations, but for the learning profession as a whole. Everyone in the learning profession has heard discussion of how to get “a seat at the table”. We want to be – and we know we can be – valued business professionals who make a difference in our companies.

But to be seen this way, and to have the opportunity for more involvement, we need to add value and, just as importantly, we need to be perceived to be adding value. By thinking about and actively managing our own brand image, we can rely less on the hope that we’ll be offered a seat. Instead, we can get much closer to pulling up our own chair when it comes to being seen as an important part of the core business.

Fredrickson Communications eZine - December 2009

by Site Admin,

In this edition of the Fredrickson eZine . . .

Happy Holidays

by Lola Fredrickson, CEO
Fredrickson Communications

On behalf of the owners and all the employees of Fredrickson Communications, I wish everyone a happy and safe holiday season and a prosperous beginning to 2010.

I know most of us will remember 2009 for the very difficult economic times that we’ve all had to face. It hasn’t been easy, but we’ve managed to pull through and I hope that we all find better times ahead in 2010.

I hope everyone can take the time to be with family and friends and to enjoy the season. Happy holidays!

Lola Fredrickson

How Do We Improve the Learner Experience of LMS’s?

by John Wooden, Director of Usability Services
Fredrickson Communications

Corporate learning management systems do not appear very often in usability research literature. True, there’s been plenty of discussion of the issues that often arise during LMS implementations, especially as administrators try to integrate various types of content. (Fredrickson technical architect Monique Benson addressed these issues in an earlier article on our site.) But the end-user experience of these applications (the learner’s experience, rather than the administrator’s or instructor’s) has not been a high priority. This needs to change if the promise of these tools is ever to be realized. I don’t mean to suggest that vendors haven’t made improvements over the years, but more remains to be done.

As a client of ours explained in an email to me, “LMS’s are designed for learning professionals and not for mainstream learners. There’s a real need to make them more usable for the learners.” I agree completely. So what can be done about it?

Read more. . .

John Wooden

John Wooden has worked on a diverse range of web projects for Fortune 500 companies and local, county, and state governments in his role as Fredrickson’s director of user experience services. He has led website redesign and information architecture efforts, and conducted hundreds of usability tests and heuristic evaluations on both websites and applications. Behind the scenes, John has developed usability guidelines and interface design standards for applications and websites.

John has taught classes in usability and user-centered design at the University of Minnesota and has presented dozens of seminars on usability and web-related topics.

John has a PhD in English and is a Certified Usability Analyst and member of the Usability Professionals’ Association. He has been with Fredrickson Communications since 2000.

How Do We Improve the Learner Experience of LMS’s?

by John Wooden, UX Director

Corporate learning management systems do not appear very often in usability research literature. True, there’s been plenty of discussion of the issues that often arise during LMS implementations, especially as administrators try to integrate various types of content. (Fredrickson technical architect Monique Benson addressed these issues in an earlier article on our site.) But the end-user experience of these applications (the learner’s experience, rather than the administrator’s or instructor’s) has not been a high priority. This needs to change if the promise of these tools is ever to be realized. I don’t mean to suggest that vendors haven’t made improvements over the years, but more remains to be done.

As a client of ours explained in an email to me, “LMS’s are designed for learning professionals and not for mainstream learners. There’s a real need to make them more usable for the learners.” I agree completely. So what can be done about it?

The issue of LMS usability is complex, and there’s no quick five-step answer to the question in this article’s title. To mention just one knotty problem, it’s long been a concern that LMS’s are only searchable from within the system itself. This means that you can’t do a general intranet search and get LMS content in your search results. There are potential ways to address this, but they are challenging.
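
To make the idea concrete, here is a minimal sketch (in Python, purely as an illustration) of what one of those approaches could look like: export basic course metadata from the LMS and add it to the index that powers intranet search. The course records, field names, and the index_document helper below are hypothetical, and a real integration would depend on the LMS’s export or API options and on the intranet’s search platform.

    # Hypothetical sketch only: make LMS course metadata visible to intranet search.
    # The course records, field names, and index_document() helper are illustrative;
    # a real integration depends on the LMS's export or API options and on the
    # search platform the intranet uses.

    courses = [
        {"id": "C101", "title": "Microsoft Project Basics", "type": "eLearning"},
        {"id": "C205", "title": "Project Management Fundamentals", "type": "Classroom"},
    ]

    # Minimal stand-in for an intranet search index: map each word to course ids.
    intranet_index = {}

    def index_document(doc_id, text):
        for word in text.lower().split():
            intranet_index.setdefault(word, set()).add(doc_id)

    for course in courses:
        index_document(course["id"], course["title"] + " " + course["type"])

    # A general intranet search for "project" can now surface LMS content too.
    print(sorted(intranet_index.get("project", set())))  # ['C101', 'C205']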

Still, it would be an important step in the right direction if more companies using these systems routinely conducted usability tests of their LMS’s with representative learners. When I’ve had the opportunity to do testing of major LMS’s at client sites, we’ve always gained insight into how people interact with these tools and how to improve the experience of using them.

Let’s take one LMS test project as an example of what you could do and what you might learn in your own organization.

Usability test questions

We began by drafting a usability test plan and recruiting about 15 testers in the client company to represent two user groups: (1) learners and (2) those who played a role in helping learners select and register for courses. Those in the latter group were often administrative assistants, rather than LMS administrators or instructors. In our test plan, we listed 10 key questions that we wanted to answer:

  1. How do testers prefer to find courses – browsing the catalog, or using the search feature?
  2. When do they browse versus search?
  3. What tester performance issues arise when using the catalog and when using search?
  4. Are the training content categories helpful and clear?
  5. Do learners use the advanced search feature? What performance and preference issues arise? For example, do they know that they need to do an advanced search to find courses by location and type?
  6. Are the search categories provided in advanced search helpful and clear?
  7. Can users easily begin a new advanced search?
  8. Are the icons meaningful to testers?
  9. Can users easily register for a course?
  10. Is it clear to testers how to check their transcript and change preferences?

We then drafted task scenarios designed to help us answer these questions. What follows is a high-level summary of the key issues the test uncovered.

Browse or search?

Even though the LMS user interface (UI) was designed for browsing, most testers ended up using the search feature. The learner’s starting point in the system was a topic index page that presented about 10 topic headings — such as Business Systems & Applications, Human Resources, Project Management, and Personal Effectiveness — in a very large bold font. When the learner clicked one of these headings, they would see the first page of an often lengthy course list. They could then click the course they wanted, read the description, and proceed to register for it.

Two main problems were evident when testers tried to find a course by browsing: (1) It wasn’t always obvious which category to check for a particular course. Although some headings were clear, others were ambiguous, such as Personal Effectiveness. (2) It took too much time to review the long lists of courses in some categories – for example, Project Management had 80 courses. So although the LMS interface invited the testers to browse by presenting big, bold category headings on the first page, this activity became too time consuming and difficult and was abandoned.

What testers did instead was to use the search, though this feature was much less prominent on the page than the topic categories. Unfortunately, the testers found that Search was not the easy alternative they were hoping for, because as some of them eventually realized to their frustration, the searches were scoped by default. In other words, they were limited to searching within just one topic category at a time. This meant that if they inadvertently looked for a course in the wrong category and saw no results, they might think that the course did not exist. (In a usability test of another LMS, we found the search logic was set at “Starts with” by default, leading to disastrous results for most testers who used it. Few testers knew precisely what the course names started with – “MS Project?” Sorry, no results. The user had to enter “Microsoft Project” to find anything.)
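
As a side note for the technically inclined, the short sketch below shows why a “Starts with” default is so unforgiving. The course titles and both search functions are invented for illustration; they simply contrast prefix matching with matching the term anywhere in the title.

    # Illustration only: why a "Starts with" default is so unforgiving. The course
    # titles and both search functions are made up; they simply contrast prefix
    # matching with matching the term anywhere in the title.

    courses = ["Microsoft Project 2007 Essentials", "Advanced Microsoft Project"]

    def starts_with_search(query, titles):
        return [t for t in titles if t.lower().startswith(query.lower())]

    def contains_search(query, titles):
        return [t for t in titles if query.lower() in t.lower()]

    print(starts_with_search("MS Project", courses))  # [] -- "Sorry, no results."
    print(contains_search("Project", courses))        # finds both courses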

Advanced search

The advanced search feature of the LMS being tested was more than just a supplement for savvy users — it was a critical feature. It presented a number of additional search options that many learners would be interested in. For example, they could look for a classroom course in a particular physical location (the client company has offices in many cities in different countries) or find learning material in a specific format (such as eLearning, CD, book, etc.). But in most cases the testers did not even see the Advanced Search option, and if they did, they were not typically enthusiastic about using it. (Most users avoid Advanced Search on websites. Just calling something “Advanced” scares most of us away.)

Training lingo

Another problem was evident from observing and listening to the testers – the use of training lingo in field labels and menus, such as “ILT,” “Training Activity Type,” “Roster,” and so on. Most of the testers didn’t understand these terms and would sometimes make an incorrect choice because of it.

Course registration

Testers stumbled through the process of course registration and were often unsure if they had actually registered. After the trouble they experienced finding a course, this compounded their frustration. The key concerns the testers expressed were uncertainty about where they were in the registration process, how many steps they needed to complete, and whether they had successfully registered. Just as with the shopping cart and checkout experiences on retail websites, the course registration process in LMS’s requires clear signposting and lots of feedback to let learners know where they are, what they need to do next, and whether they have been successful.

What you can do

Based on this high-level summary, it’s obvious there was a lot of room for improvement in the usability of this client’s LMS – in the labeling and presentation of search and navigation options, in the search logic, in the registration process, and in terminology.

And as we recommended to our client’s LMS vendor, we believe one of the most useful features an LMS can provide is faceted search (also known as guided navigation). For users looking for learning materials, a well-implemented faceted search feature would be a huge help. (The vendor said they were working on it.)
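
To make the recommendation concrete, here is a small, purely illustrative sketch of faceted filtering over a course catalog. The facet names and sample records are assumptions, not a description of any vendor’s implementation; the point is that learners narrow a long list by combining filters rather than guessing keywords or categories.

    # Hedged sketch of faceted search over a course catalog. The facet names
    # (topic, format, location) and the sample records are assumptions; the point
    # is that learners narrow a long list by combining filters and seeing counts,
    # instead of guessing keywords or categories.

    catalog = [
        {"title": "Negotiation Skills", "topic": "Personal Effectiveness", "format": "Classroom", "location": "St. Paul"},
        {"title": "Project Scheduling", "topic": "Project Management", "format": "eLearning", "location": "Online"},
        {"title": "Scope Management", "topic": "Project Management", "format": "Classroom", "location": "London"},
    ]

    def facet_counts(courses, facet):
        counts = {}
        for course in courses:
            counts[course[facet]] = counts.get(course[facet], 0) + 1
        return counts

    def apply_filters(courses, **filters):
        return [c for c in courses if all(c.get(k) == v for k, v in filters.items())]

    # A learner drills down: Project Management courses offered in the classroom.
    results = apply_filters(catalog, topic="Project Management", format="Classroom")
    print([c["title"] for c in results])       # ['Scope Management']
    print(facet_counts(catalog, "format"))     # {'Classroom': 2, 'eLearning': 1}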

But this leads to a key question: “Even if I do test the usability of our LMS, what can I do about the issues the test uncovers? It’s a vendor product. We don’t want to do significant customization because that will complicate future upgrades.”

If you are working with a vendor system, as opposed to an open-source product, it’s true that not all of the power is in your hands. However, here are a few options to consider:

  • If a problem is something you can configure yourself as a client, you can address it in house. This is usually the case with labels and terms that learners have trouble understanding. And terminology is often a huge part of usability.
  • Invite vendor reps to observe usability testing – there is no better catalyst for change. This is preferable to simply reporting the issues to the vendor, because when vendors see real users experience real issues, the impact is much greater. It’s hard for them to try to deny or explain away this evidence. (If the vendor can’t be there in person, send them video clips.) In addition, make sure your vendor has a user forum and make your voice heard.
  • With issues that you know the vendor is unlikely to address any time soon because of their complexity (such as issues pertaining to search functionality), you can at least create quick reference cards or other guidelines for instructors and learners. If nothing else, by uncovering the specific tasks or process steps that learners struggle with, you can create more targeted and helpful user assistance material.
  • Finally, don’t let LMS vendors off the hook – let them know what is frustrating end users and slowing their productivity. Tell them that the learner experience has to become more of a priority.

Fredrickson Communications eZine - October 2009

by Site Admin,

In this edition of the Fredrickson eZine . . .

The Fredcomm Blog

by Lola Fredrickson, CEO
Fredrickson Communications

We have a new addition to the Fredrickson Communications website—the Fredcomm Blog.

At Fredrickson, we make sharing information and fostering connections between learning, communications, and usability professionals part of our business. We do so through our seminars, the organizations we sponsor like the Fredrickson Roundtable for Learning Leaders and Intersect, and now through our blog.

We decided a blog would be a great addition to our website because it enables us to publish more frequently and with greater ease.

Like our eZine, our blog isn’t dedicated to just one area of our business. Our Fredrickson contributors will be weighing in with perspectives from all our practice areas. John Wooden will be there with views on usability and technology, Robin Lucas will be writing about the business of learning, and we’ll have contributions from Joyce Lasecke and other Fredcommers. We hope you’ll visit and return often.

2010: The Year Social Learning Goes Mainstream?

by J. Hruby, Director of Marketing
Fredrickson Communications

One of the most talked-about trends in corporate learning over the past few years has been social learning—using social media technologies like blogs, discussion forums, and wikis to enhance learning.

Of course, talking about something and actually doing it are always two different things. Many corporate learning groups have found resistance both on the acceptance and implementation sides of the social learning equation.

First, let’s consider the acceptance of social media. Some companies have been slow to allow or encourage the use of social media tools because these tools have been branded as “time wasters” and “only of interest to tech-addicted Gen Y’ers.”

I can’t help but feel déjà vu all over again when I hear statements like this.

Read more . . .

Seminar: Surefire Ways to Manage the Review Process for Online Learning

Part of the 2009
ASTD-TCC Regional Conference & Expo – November 12, 2009

Fredrickson’s Robin Lucas and J. Hruby will deliver a seminar about conducting more efficient and effective reviews of online content. This seminar is part of the Technology and eLearning track at the 2009 ASTD-TCC Regional Conference & Expo.

Content reviews are an important step in the development process for online learning. Even in a world of rapid development tools, most online learning courses still require one or more formal reviews. Stakeholders, partners, and others usually want a say in the course’s design and content.

The question for every learning manager and content developer is this: How do you manage reviews so that they are effective, targeted, and consume the least amount of time? This seminar is designed to answer this question with a proven process, techniques, and advice for everyone involved in online learning development.

Read More
For more advice on conducting effective online reviews, see Robin Lucas’s article on the Fredrickson Communications website.

Seminar Date, Time, & Location
November 12, 2009
3:00 to 4:15 p.m.
RiverCentre, St. Paul, MN

Registration
For more information and to register for the 2009 ASTD-TCC Regional Conference & Expo, visit the ASTD-TCC’s website.

Fredrickson Communications Shares in Prestigious Tekne Award

by J. Hruby, Director of Marketing
Fredrickson Communications

A project for which Fredrickson provided usability and user-centered design support has won a Tekne Award. Fredrickson’s John Wooden provided a series of usability and user-centered design classes to key members of the CareerOneStop redesign team in order to help them during the redesign process.

Read more . . .

J Hruby

J. Hruby is Fredrickson’s Vice President of Sales & Marketing and he also works with Fredrickson’s clients to develop learning strategies and related eLearning, training, and performance support products.

J. enjoys writing articles and presenting to professional organizations about issues related to eLearning, user-centered design, and the role of technology in improving performance. He has presented seminars to the local chapter of the American Society for Training and Development (ASTD), the Minnesota Government IT Symposium, and the Society for Technical Communication (STC).

Before joining Fredrickson, J. was a training and quality systems documentation manager for AlliedSignal and Honeywell.

CareerOneStop Wins Prestigious Tekne Award

by J Hruby, Vice President, Sales & Marketing

A project for which Fredrickson provided usability and user-centered design support has won a Tekne Award. Fredrickson’s John Wooden provided a series of usability and user-centered design classes to key members of the CareerOneStop redesign team to help them during the redesign process.

From the Tekne Awards program:
Under the guidance of Program Director Mike Ellsworth, the state of Minnesota’s Department of Employment & Economic Development and Minnesota State Colleges & Universities collaborated to redevelop CareerOneStop.org, integrating a new web content management system, a web site search engine and a new information architecture and taxonomy. CareerOneStop.org, a U.S. Department of Labor-sponsored website, offers career resources and workforce information to various audiences including job seekers, workforce counselors, employers, and economic developers. The site is now the premier source for career information, with a greater breadth of data than any other public or private site. It offers much of its information and services for integration into other web sites through Web Services. CareerOneStop.org serves more than 24 million unique visitors each year and its new look has garnered mentions by ABC News as well as financial commentator, Suze Orman.

The Star Tribune published a complete list of the 2009 Tekne Award winners. Fredrickson is very proud to have contributed to this project’s success.

Fredrickson Communications eZine - August 2009

by Site Admin,

In this edition of the Fredrickson eZine . . .

Surefire Ways to Manage Reviews of Online Content

by Robin Lucas, Director of Project Management
Fredrickson Communications

Even in a world of wikis and blogs, most of the content that we write for a wide business audience requires some form of review. Stakeholders, partners, and associates usually want a say in what is published. So the question is: How do you manage your reviews effectively so that you have the quickest time to publication?

Imagine that you are publishing and deploying your new content tomorrow. Today, the Vice President of Marketing sends you a list of changes that you asked for over a week ago. Two hours later, you get an email from the Brand Manager, announcing changes to the way your company refers to the products described in your content. To complicate matters, you find several contradictions — not only between what the VP and the Brand Manager say, but also between this new feedback and the feedback you received from a previous review. The result is that, with only one day left on your schedule, content that you thought was approved needs to change, and you need to sort out and resolve some conflicting last-minute feedback.

I’ve been in these shoes before. Through experience, I’ve discovered how to avoid situations like this.

Read more . . .

How’s that LMS working for you?

by J. Hruby, Director of Marketing
Fredrickson Communications

If your answer to this question is along the lines of “not very well” or “not as well as we’d like”, or even, “I don’t want to talk about it,” then read on.

I recently spoke to a prospective client who is the Chief Learning Officer of a large regional health care provider. This health care company owns several hospitals and has thousands of employees. He called us because he couldn’t stand the pain their LMS was inflicting on his learning organization anymore, and he was wondering if anything could be done about it. Through our conversation, the usual sad LMS story emerged.

Everything was fine for a while after implementation, but as the implementation team drifted away, their LMS problems started slowly but steadily piling up. Their LMS administrator, who had been part of the implementation team, left the organization. The new administrators found it difficult to integrate new courseware, users complained about how difficult it was to find and register for classes, learning administrators were having trouble setting up new classes, and so on.

Their response to all of these problems was unfortunately fairly typical as well: they retreated into a kind of LMS bunker by limiting their use of the system to just the few functions that they knew they could administer. Essentially, they had retreated so far into the LMS bunker that they now have two full-time administrators who manually key class attendance data into the system.

This wasn’t exactly what the CLO envisioned when his organization bought a top-tier LMS. His first question was, “Is it the LMS or is it us?”

I had to give it to him straight: It’s probably you. By that I mean that knowing what I know about their LMS, I’m sure it has all the technical capabilities they need and quite a few they don’t.

As I told him, the care and feeding of an LMS is a technical endeavor. While some basic administration tasks can be done by almost anyone, when the going gets tough you really need to turn to technical professionals. And once the LMS techies have things up and running, you need to implement sound LMS administration processes and procedures to keep them that way.

If you feel like you’re not getting enough out of your LMS, or if you feel like you’re trapped in the LMS bunker, get in touch. We can help with LMS consulting, courseware troubleshooting, LMS administration processes and procedures, usability testing, LMS portals, and a variety of other services. Your LMS should be working for you. If it’s not, give us a call.

Coming Attractions for Learning & Development Professionals

by J. Hruby, Director of Marketing
Fredrickson Communications

September presents two terrific opportunities for learning and development professionals in the Twin Cities area.

Elliott Masie – Futurist and all-around learning guru Elliott Masie will be in town on September 25 as the featured speaker for ASTD Twin Cities Chapter’s monthly meeting. I have attended a Masie seminar in the past and he has a knowledgeable and entertaining style that makes this a “do-not-miss” event.

For more information and to sign up, see the ASTD-TCC’s website.

Members of PACT can attend this session free of charge, so if you’ve been considering joining this excellent Twin Cities-based organization, now would be a great time to do so.

Bill George – Former Chairman and CEO of Medtronic Bill George has just released a new book called Seven Lessons for Leading in Crisis. We believe that many of the members of our Fredrickson Roundtable for Learning Leaders, as well as others, will be interested in attending this free business event. It’s called Summit on Leading in a Crisis, and it will be held on September 17 from 5:30 to 7:30 p.m. at the Ted Mann Theater at the University of Minnesota.

George will moderate a panel with CNN commentator David Gergen; Anne Mulcahy, chair and former CEO of Xerox Corporation; John Donahoe, president and CEO of eBay; and Marilyn Carlson Nelson, chair and former CEO of Carlson Companies.

For more information, see the Minneapolis / St. Paul Business Journal.

Fredrickson Communications eZine - July 2009

by Site Admin,

In this edition of the Fredrickson eZine . . .

Adjusting to The Big Shift

by John Wooden
Director of Usability Services
Fredrickson Communications

John Hagel, John Seely Brown, and Lang Davison, all leaders of the Deloitte Center for the Edge, recently published a provocative study called “The Big Shift: Measuring the Forces of Deep Change.” Once I read it, I wanted everyone attending Fredrickson’s July 16 Learning Leadership Summit to read it in preparation. This is an authoritative, data-driven argument to stop managing for a 20th-century environment when we are living in the 21st.

The authors published a summary version of the report in the July-August edition of the Harvard Business Review. An even shorter summary is available in the authors’ blog on Harvard Business.org. The report presents some unsettling findings:

  • The Return on Assets (ROA) for US firms has steadily fallen to almost one-quarter of 1965 levels. (ROA measures the return a company generates from its total assets.)
  • The ROA performance gap between winners and losers has increased over time, with the winners barely maintaining previous performance levels while the losers experience rapid deterioration in performance.
  • The “topple rate” at which big companies lose their leadership positions has more than doubled, suggesting that “winners” have increasingly precarious positions.
  • US competitive intensity has more than doubled in the last 40 years.
  • While the performance of US firms is deteriorating as measured by ROA, the benefits of productivity improvements appear to be captured in part by creative talent, which is experiencing greater growth in total compensation. Customers also appear to be gaining and using power as reflected in increasing customer disloyalty.

Read more . . .

Usability and User-Centered Design at 3M: An Interview with Kathryn Bohlke

by John Wooden
Director of Usability Services
Fredrickson Communications

Kathryn Bohlke is Manager of 3M IT’s User Information and Usability (UIU) Group. In this role, she has been instrumental in building a usability practice area, creating a comprehensive body of user interface design standards, and ensuring that applications and internal web sites at 3M are subjected to thorough UI design reviews. Kathy is also a Certified Usability Analyst through HFI and a graduate of the University of Minnesota’s Management of Technology Master’s program.

John Wooden, Director of Usability Services at Fredrickson Communications, spoke with Kathy about building and managing the UIU Group at 3M. Read more . . .

Learning Leadership Summit 2009 – recap and looking ahead

by J. Hruby
Marketing Director
Fredrickson Communications

Fredrickson Communications held the third annual Learning Leadership Summit on July 16 at Travelers in St. Paul. We founded this event to provide a gathering specifically for the leaders of corporate learning organizations throughout the Midwest.

This year we had a terrific turnout with more than 70 learning leaders from companies all over the Twin Cities in attendance. The theme of this year’s Summit was “Danger! Economic recovery ahead,” and our discussion centered on the role of the learning organization in the coming economic recovery. It’s impossible to capture all of the thoughtful and interesting discussion points that came from this event, but I’ll summarize a few points that I thought were especially interesting. Read more . . .

John Wooden

John Wooden has worked on a diverse range of web projects for Fortune 500 companies and local, county, and state governments in his role as Fredrickson’s director of user experience services. He has led website redesign and information architecture efforts, and conducted hundreds of usability tests and heuristic evaluations on both websites and applications. Behind the scenes, John has developed usability guidelines and interface design standards for applications and websites.

John has taught classes in usability and user-centered design at the University of Minnesota and has presented dozens of seminars on usability and web-related topics.

John has a PhD in English and is a Certified Usability Analyst and member of the Usability Professionals’ Association. He has been with Fredrickson Communications since 2000.

References

(1) Dave Wilkins of Learn.com has provided links in his blog to several good examples of social learning initiatives, which clearly demonstrate the ways in which social media can amplify knowledge flows and create value for the organization. Wilkins also notes that the initiatives he provides as examples were not led by training departments.

Adjusting to The Big Shift

by John Wooden, UX Director

John Hagel, John Seely Brown, and Lang Davison, all leaders of the Deloitte Center for the Edge, recently published a provocative study called “The Big Shift: Measuring the Forces of Deep Change.” Once I read it, I wanted everyone attending Fredrickson’s July 16 Learning Leadership Summit to read it in preparation. This is an authoritative, data-driven argument to stop managing for a 20th-century environment when we are living in the 21st.

The authors published a summary version of the report in the July-August edition of the Harvard Business Review. An even shorter summary is available in the authors’ blog on Harvard Business.org. The report presents some unsettling findings:

  • The return on assets (ROA) for US firms has steadily fallen to almost one-quarter of 1965 levels. (ROA measures the return a company generates from its total assets.)
  • The ROA performance gap between winners and losers has increased over time, with the winners barely maintaining previous performance levels while the losers experience rapid deterioration in performance.
  • The “topple rate” at which big companies lose their leadership positions has more than doubled, suggesting that “winners” have increasingly precarious positions.
  • US competitive intensity has more than doubled in the last 40 years.
  • While the performance of US firms is deteriorating as measured by ROA, the benefits of productivity improvements appear to be captured in part by creative talent, which is experiencing greater growth in total compensation. Customers also appear to be gaining and using power as reflected in increasing customer disloyalty.

In their June 19 blog entry, Seely Brown, Hagel, and Davison wrote that they began their study two years ago as an attempt “to get our heads around the long-term transformation we saw happening to the global business environment as a result of digital technology and, to a lesser extent, public policy changes [such as free-trade agreements]. We later came to call this transformation the Big Shift.” What they describe is a “new reality” of “constant disruption.” The old cycle of disruption – return to equilibrium – new disruption may well be a thing of the past. Why?

“Today’s core technologies—computing, storage, and bandwidth—are not stabilizing. They continue to evolve at an exponential rate. And because the underlying technologies don’t stabilize, the social and business practices that coalesce into our new digital infrastructure aren’t stabilizing either. Businesses and, more broadly, social, educational, and economic institutions, are left racing to catch up with the steadily improving performance of the foundational technologies. For example, almost forty years after the invention of the microprocessor, we are only now beginning to reconfigure the digital technology infrastructure for delivery of yet another dramatic leap in computing power under the rubric of utility or cloud computing. This leap will soon be followed by another, then another.” (The New Reality: Constant Disruption. The Big Shift blog. January 17, 2009.)

Although many writers have talked about the varied effects on business of the rapidly growing digital infrastructure, The Big Shift attempts to quantify what is happening. The authors set out to track “25 metrics in nine categories across three sets of main indicators: Foundations, which set the stage for major change; Flows of resources, such as knowledge, which allow businesses to enhance productivity; and Impacts, which help gauge progress at an economy-wide level. Together these indicators represent phases of transformation in the Big Shift taking place in the global business environment.” (Measuring The Big Shift.)

“Taken as a whole, the findings portray a U.S. corporate sector in which long-term forces of change are undercutting normal sources of economic value. ‘Normal’ may in fact be a thing of the past: even after the economy resumes growing, companies’ returns will remain under pressure.

To respond to this performance challenge, U.S. companies will need to let go of industrial-era organizational structures (and the reporting relationships, incentive systems, and managerial processes that go with them) and operational practices in favor of the new institutional architectures and business practices needed to create and capture economic value in the era of the Big Shift.

Companies must move beyond their fixation on getting bigger and more cost-effective to make the institutional innovations necessary to accelerate performance improvement as they add participants to their ecosystems, expanding learning and innovation in collaboration curves and creation spaces. Companies must move, in other words, from scalable efficiency to scalable learning and performance [emphasis added]. Only then will they make the most of our new era’s fast-moving digital infrastructure.” (Ibid.)

The relevance to organizational learning

The Big Shift is obviously pertinent to all functional areas of business, but the relevance to those in organizational learning roles is especially clear.

Seely Brown, Hagel, and Davison emphasize that it is no longer sufficient for organizations simply to manage their existing stocks of knowledge. What is becoming more critical is the ability to facilitate flows of knowledge throughout and between organizations. They cite the example of SAP’s Developers’ Network, which includes 1.5 million participants and extends beyond the boundaries of the company. Posted questions are answered in 17 minutes on average, and 85% of the questions posted so far have been categorized as “resolved.”

As the authors note:

“To succeed now, companies (and individuals) have to continually refresh what they know by participating in relevant “flows” of new knowledge. Tapping into and harnessing the flows of knowledge, especially flows generated by the creation of new knowledge, increasingly define one’s competitive edge, personally and professionally.”

The authors underscore the role that social media must play internally in organizations as “amplifiers” of knowledge flows, a point I have been making in presentations over the past couple of years. More organizations are beginning to understand this and to use social media to support both formal and informal learning, but we are still in the very early stages. What is disappointing is that so far many of the initiatives to support knowledge flow in organizations have not been led by learning departments. In many cases, these initiatives have been led by IT or Marketing (1).

So what needs to happen? Organizational learning professionals need to read and share the findings and ideas presented in The Big Shift, to acknowledge the “new reality” that Seely Brown, Hagel, and Davison describe, and to position themselves not only as “trainers” but as facilitators and coordinators of dynamic knowledge flows in their enterprises. This means partnering with IT, Communications, and HR, and thinking beyond the course, module, and other formal learning modes to adopt more amplifiers of informal learning—social/knowledge networking, forums, blogs, wikis, and so on.

Economic recovery will take place and the business cycle will continue, but based on the data presented in The Big Shift, big challenges will remain unless business leaders begin to recognize that the old 20th-century managerial tools and techniques are not all useful anymore. For their part, learning leaders need to shake up their toolbox too.

Robin Lucas

Robin Lucas is Vice President of Fredrickson Learning. She has been in the field of training, documentation, and communications for over 20 years. She has managed hundreds of projects of all sizes.

Robin has also served as a “program manager” for several Fredrickson clients, defining project processes and standards and ensuring that team members follow them. She has worked in the areas of financial services, telecommunications, retail, and Enterprise Resource Management systems. Robin holds a BA in English from Rice University.

Tips and techniques for non-responsive reviewers

Many content developers struggle with reviewers who are too busy, distracted, or uninterested in providing feedback. Following are recommendations for managing that relationship effectively.

  • Set expectations and get time commitments from reviewers and the people to whom they report.
  • Highlight the project deadline up-front.
  • Ask the reviewer to set a date for response; negotiate the date if necessary to leave yourself enough time to complete your part of the work.
  • Offer to point out the specific section(s) where the reviewer’s feedback or input is most important, so the reviewer does not feel overwhelmed by the amount of content to review.
  • Send reminders about promises to respond; ask whether any roadblocks have appeared.
  • Play the “boss” card sparingly and never threaten to go over the reviewer’s head. Instead, ask how you can help secure the reviewer’s time by negotiating with the boss.
  • When a deadline passes, do what you say you will do, even if that means publishing without reviewer input.

Surefire Ways to Manage Reviews of Online Content

by Robin Lucas, Vice President, Fredrickson Learning

Even in a world of wikis and blogs, most of the content that we write for a wide business audience requires some form of review. Stakeholders, partners, and associates usually want a say in what is published. So the question is: How do you manage your reviews effectively so that you have the quickest time to publication?

Imagine that you are publishing and deploying your new content tomorrow. Today, the Vice President of Marketing sends you a list of changes that you asked for over a week ago. Two hours later, you get an email from the Brand Manager, announcing changes to the way your company refers to the products described in your content. To complicate matters, you find several contradictions — not only between what the VP and the Brand Manager say, but also between this new feedback and the feedback you received from a previous review. The result is that, with only one day left on your schedule, content that you thought was approved needs to change, and you need to sort out and resolve some conflicting last-minute feedback.

I’ve been in these shoes before. Through experience, I’ve discovered how to avoid situations like this. It comes down to three magic words:

  • People
  • Process
  • Communication

Mastering the tips in this article will help you obtain valuable feedback from reviewers while sticking to your schedule and maintaining your team’s sanity.

People

When you start a project, get answers to these questions:

  • Who needs to be involved and when?
  • What is this person’s role as a reviewer? For example, is he responsible for the technical accuracy of a specific topic? Is she responsible for ensuring adherence to brand standards?
  • Who has the final say?

Be sure to ask these questions not only for the members on the team, but for all interested parties. This helps prevent the emergence of last-minute, surprise reviewers.

Process

Equally important as the people involved is the process you follow. If you don’t have a process, now is the time to define one. Take these steps to establish your process:

  • Define the number of and objectives for each review cycle.
  • Determine who needs to contribute feedback in each cycle.
  • Designate one person to be the review coordinator — the person who coordinates and compiles all feedback.
  • Specify the type of feedback you want from the reviewers.
  • Provide a way for people to communicate their comments and feedback easily.
  • Keep a history of comments and changes to aid in resolving issues.

The success of this process depends on two key elements. The first is the role of the review coordinator. When assigning the review coordinator role, look for someone with strong organizational skills who knows how to get people’s attention and facilitate agreement. The review coordinator’s responsibilities include:

  • Compiling all comments.
  • Settling differences and/or gaining agreement on what not to incorporate.
  • Securing sign-off.

Once you designate the role, make sure the team understands that this one person has authority for coordinating all review cycles and all feedback within the review cycle.

The second key is to make it as easy as possible for the team to review the content and make comments. Some content development tools have built-in review features — such as the Track Changes feature in Microsoft Word or the online collaboration capabilities of Google Docs. But for other publication formats, especially the web, reviewing electronically is not as easy, and keeping a history of the changes and corrections can be impossible to manage. (For information on a tool that can help make this easier, see the sidebar Our tool for managing reviews of online content.)
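
For readers who like to see what “keeping a history” can mean in practice, here is a minimal, hypothetical sketch of a comment log that records each piece of feedback, who gave it, and how it was resolved. It is an illustration only, not a description of the tool mentioned in the sidebar.

    # Purely illustrative sketch of keeping a history of review comments and
    # decisions when the publication format has no built-in review tooling.
    # The Comment and ReviewLog structures are hypothetical and are not a
    # description of the tool mentioned in the sidebar.

    from dataclasses import dataclass, field
    from datetime import date

    @dataclass
    class Comment:
        reviewer: str
        section: str
        text: str
        status: str = "open"      # open, incorporated, or declined
        resolution: str = ""

    @dataclass
    class ReviewLog:
        cycle: str
        comments: list = field(default_factory=list)

        def add(self, reviewer, section, text):
            self.comments.append(Comment(reviewer, section, text))

        def resolve(self, index, status, resolution):
            self.comments[index].status = status
            self.comments[index].resolution = resolution

    log = ReviewLog(cycle="Review 2 - " + date.today().isoformat())
    log.add("VP of Marketing", "Section 5", "Update the product name throughout.")
    log.resolve(0, "incorporated", "Renamed per the new brand guidelines.")
    print(log.comments[0])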

Communication

Finally, as with any aspect of a project, communication among team members and with your project stakeholders is a key responsibility and integral to managing the review process effectively. Keep the following list of tips in mind as you enter the review phase of your project:

  • Send a review cover sheet with your materials that describes the “rules of the review.”
  • Set expectations for how to give feedback.
  • Develop techniques for working with reviewers. (See sidebar Tips and techniques for non-responsive reviewers.)
  • Keep stakeholders informed of reviewers’ feedback. Most importantly, communicate how the reviewers’ feedback may have changed the important messages that the stakeholders want covered in the content.

At the beginning of your project, communicate the following about the review process:

  • The cost of reviews – Remind everyone that time is money and point out the consequences of waiting to provide feedback until you are near the end of development. When feedback comes in late in the process, it can increase both your time to delivery and the overall project cost.
  • The schedule and objectives of each review cycle – If this is a first review, you may ask reviewers to confirm that you covered the content adequately and that nothing is missing. In subsequent reviews, you may only be asking reviewers to confirm that you made the requested changes. In addition to these up-front communications, use the following guidelines for framing your communications in each review cycle:

What to communicate, and how to do it:

The reviewer’s role
    • State what feedback the reviewer is responsible for. For example, do you want them to verify the accuracy of the content or to provide input on the content?
    • Indicate what not to review for. For example, “Final edits will be made after all comments are supplied. Don’t spend your time with spelling, punctuation, grammar, etc.”

The deadline for review
    • Give reviewers enough time, but not too much time, to provide feedback. You may want to include a weekend in the review period since reviewers often take this type of work home.
    • Follow up with those who don’t respond, but be clear that you have to move forward.

The objective of the review
    • Explain the differences between changes and corrections and let people know when they should make corrections only.
    • Be specific about the feedback you need, if appropriate. For example, “Please look at section 5 to confirm that all steps of the process are included.”

The format of comments, feedback, and corrections
    • Explain how to provide feedback — can people send you handwritten changes, or should they send material electronically only?
    • Provide examples of comments that are specific and that result in no doubt about what change or correction is needed.
    • If you are providing a tool to track comments, changes, and decisions, explain how to use it.

By following these guidelines, you should find that getting your content to publication not only becomes easier but results in less confusion and more agreement on what is published.

John Wooden

John Wooden has worked on a diverse range of web projects for Fortune 500 companies and local, county, and state governments in his role as Fredrickson’s director of user experience services. He has led website redesign and information architecture efforts, and conducted hundreds of usability tests and heuristic evaluations on both websites and applications. Behind the scenes, John has developed usability guidelines and interface design standards for applications and websites.

John has taught classes in usability and user-centered design at the University of Minnesota and has presented dozens of seminars on usability and web-related topics.

John has a PhD in English and is a Certified Usability Analyst and member of the Usability Professionals’ Association. He has been with Fredrickson Communications since 2000.

Usability and User-Centered Design at 3M:  An Interview with Kathryn Bohlke

by John Wooden, UX Director

Kathryn Bohlke is Manager of 3M IT’s User Information and Usability (UIU) Group. In this role, she has been instrumental in building a usability practice area, creating a comprehensive body of user interface design standards, and ensuring that applications and internal web sites at 3M are subjected to thorough UI design reviews. Kathy is also a Certified Usability Analyst through HFI and a graduate of the University of Minnesota’s Management of Technology Master’s program.

John Wooden, Director of Usability Services at Fredrickson Communications, spoke with Kathy about building and managing the UIU Group at 3M.

John:
During your time as 3M’s UIU Manager, has usability – as both a practice and an objective – become more widely accepted by 3M IT’s internal clients? Or do you still need to do a lot of educating about the benefits of usability?

Kathy:
It really is a mixture regarding the acceptance of usability testing, but the awareness and use have increased for what I call “real” usability feedback. By that I mean conducting formal usability test sessions with actual end users as opposed to providing walk-throughs or training sessions and calling those usability reviews, or doing user acceptance testing with project team members and calling that usability testing.

However, the awareness is broader than the actual practice, because even though most agree that the solution should be usable, there is not always agreement on how to validate usability. Some stakeholders have different interpretations as to what constitutes usability testing. But overall the business side has been more open and accepting of usability testing than IT – maybe because they are closer to end users, market needs, and business value. IT has a history of building something, throwing it over the fence, and expecting users to work with it. Now users are pushing back – there is a greater expectation of usability – and this includes executive-level users as well. The increase in self-service applications may have led to this expectation of greater usability and productivity.

The biggest value add from usability within IT is having it be part of our IT development methodology. This was partly a result of a Six Sigma project to identify the Cost of Poor Quality. Usability is considered a quality indicator, and so usability evaluation is a recurring activity throughout the overall IT methodology. To support this, various usability artifacts are provided, such as a Usability Testing Checklist, Usability Test Plan, Usability Test Script, and Usability Analysis Report. Usability evaluation is also part of multiple gate reviews. The project team holds a gate review at the end of each phase in the methodology to report progress and evaluate status to determine whether the project should go into the next phase. Usability was not one of the attributes initially used to evaluate the status of the project, and it is now. So, if there is poor usability, the project team has to decide whether to fix the issues or accept the risk.

So there’s definitely been progress, but we will never stop educating. All of our internal consulting engagements are an opportunity for communicating and educating about usability. We also give presentations at department team meetings, at Project Leader forums, and so on. There is a module for Usability in the IT Methodology’s training sessions, and I am referenced for consulting and services. Word-of-mouth has been good for educating and marketing as well. Our clients help spread the word too.

John:
You mentioned that not everyone agrees about how to validate usability. What are the most common misconceptions or misunderstandings about usability that you encounter when working with clients?

Kathy:
One point I sometimes have to emphasize is that I did not invent usability best practices – that there is a body of research supporting this field. Sometimes I have to validate usability best practices or guidelines by finding substantiating data from Jakob Nielsen or another published source. At times even that does not help because the person questioning my recommendation does not know who Jakob Nielsen is. But awareness has improved thanks to the internet. I can Google quickly to validate most things I recommend.

Another issue I have to explain is that user acceptance testing and training are not the same as usability testing. For example, I will meet with a project team and be told that they already did usability testing. I ask them to explain what method they used, which they do. I will then tell them that I call what they did training or user acceptance testing, not usability testing. When I explain real usability testing, I sometimes get a shocked reaction: “You mean, you aren’t going to teach them anything first?”

It can also be a problem when everyone on a project team thinks they are a usability expert. The truth is that you never know your end users as well as you think you do — so test with them. But even after testing, some people feel they still know better than what the user data indicates. One of the joys of testing is that you continue to learn. There is always another surprise result to react to and resolve.

Another important point is that user interface design is different from usability evaluation. Often I am asked to help a team with “usability,” and then I discover that they actually have a UI issue to resolve. The partnership between UI design and usability is tight, and some people don’t know how to distinguish between the two. What I need to do is make sure we are talking about the same thing before we get too far along into the discussion.

John:
How would you distinguish between the UI Designer role and Usability Analyst role?

Kathy:
I think UI design and usability analysis are different arts. And even within UI design, I make a distinction between transactional user interface design and graphical UI design. Transactional UI is getting the user from point A to point B efficiently and accurately — with little thinking. Graphical UI is making that path more visually interesting to cause the user to browse and contemplate. One type can blend in with the other, but to have one person be excellent at both types of design is unusual. An analogy would be a writer who is equally good at documentation, training development, and copywriting.

I work with a variety of UI designer skill sets. The ones with a lot of programming experience design differently from the ones with more technical writing experience, yet both are UI designers — they just approach it differently. That is why we have documented UI standards and do usability testing. UI standards make all the designers agree on how a certain function or control is going to look and act. Then the usability testing makes the designers question their personal preferences and the documented standards and instead react to the user. Testing keeps designers honest.

Though having an advanced human factors degree, programming experience, visual design experience, etc. can help people be better at the role, I think the true test at an interview for a UI Designer is to give them a piece of paper and pencil, and tell them to draw the landing page of an application that allows the user to order breakfast and lunch for a specific date and time. They are allowed to ask me five questions (hopefully used to clarify the requirements), and they need to be done in 30 minutes or less. That will separate the real designers from the wannabes. This exercise will also give you a clear indication if the person is a transactional or graphical UI designer. A programmer type may provide much more functionality than you want, a visual design type may want colored pencils, and a human factors type may want to know more about the user personas.

The person I would hire will have a solid sketch with clear, easily visible controls that are easy to understand and that are aligned well, with everything clearly labeled and firmly anchored so I know where I am and what I can do next.

For usability testing, much the same holds true. The various educational experiences can help, but some important qualifications can’t necessarily be learned from a class. All facilitators have to be excellent listeners who don’t try to solve the problem before the user is done speaking, who don’t fixate on personal preferences, who don’t make assumptions based on the person’s appearance or speech pattern, who don’t over-analyze data, etc. They need to stay focused and on track while looking beneath what they hear to figure out the root issues. For example, testers who can’t find the tabs at the top of the page may not mean you should make the tabs bigger, but instead may mean you need to change the background color so the tester can see the tabs better.

A really important qualification for both a UI Designer and Usability Specialist is that they are not “me” oriented. “I think” and “I like” are red flags to me. It is about the user — within the confines of tested standards and conventions.

John:
Usability and user interface design are now often included under the broader term “user experience” or UX. Different functional areas in companies have claimed UX as properly belonging in their area – for example, Marketing, IT, Communications. Do you think it belongs best in one particular functional area?

Kathy:
I think it belongs in an area that is not beholden to any other area. It needs upper-level sponsorship and support so that it is required to be part of each functional area’s DNA. My group is in IT, within a quality group. So the group crosses over all areas of IT. The problem with being in IT is that it can be perceived as an “IT thing” — not always good marketing. It would be nice to have satellite groups or reps in multiple areas that all report in to a central organization.

So when the marketing department, for example, has specific UI design challenges, those challenges could be addressed by their own UI/usability rep. But that rep would ultimately be beholden to the central organization. Organizations all have underlying political and social structures that have to be navigated, so the best fit needs to take those things into consideration as well.

John:
Based on lessons you’ve learned, what advice would you have for someone trying to build a consistent usability practice in their organization?

Kathy:
First, help upper management “get it.” Usability needs a believer beyond the ones who do it for a job. Second, track and market results. The downside for my service has been not having much follow-up. Joining a project to do a test, providing the results and recommendations, and then going away is not effective. I have found that project teams still need assistance to create the correct UI based on the usability findings. We now follow up with the project to see what they have done, and we ask to be involved in the meetings that prioritize the recommendations and decide how to resolve them. We are trying to get more involvement with the project after the testing. This involvement also makes it easier to spend the time creating a brag book or marketing results via presentations. Third, provide templates, guidelines, standards, and stories to help people learn about usability and try it on their own.

John:
There is a wealth of information here for organizations building a usability practice area. Thanks very much for sharing all the excellent insights, Kathy.

J Hruby

J. Hruby is Fredrickson’s Vice President of Sales & Marketing and he also works with Fredrickson’s clients to develop learning strategies and related eLearning, training, and performance support products.

J. enjoys writing articles and presenting to professional organizations about issues related to eLearning, user-centered design, and the role of technology in improving performance. He has presented seminars to the local chapter of the American Society for Training and Development (ASTD), the Minnesota Government IT Symposium, and the Society for Technical Communication (STC).

Before joining Fredrickson, J. was a training and quality systems documentation manager for AlliedSignal and Honeywell.

Fredrickson Communications eZine - June 2009

by J Hruby, Vice President, Sales & Marketing

In this edition of the Fredrickson eZine . . .

Danger! Economic recovery ahead.
by J. Hruby
Fredrickson Communications Marketing Director

I hate to pile more bad news onto what has essentially been two years of almost constant economic bad news, but here it goes: We could be on the verge of an economic recovery.

Bad news? Isn’t a recovery what we’ve been waiting for?

There’s absolutely nothing wrong with an economic recovery. At the same time, like any economic event, those who are most prepared will reap more of the benefits. How many leaders of learning organizations, or leaders of any other area in business for that matter, have spent any time at all preparing for an upcoming economic recovery?

As Shakespeare’s Hamlet said, “Ay, there’s the rub.”

Over the past two years, an inordinate amount of time and energy has been devoted to planning for and reacting to an ever-deepening recession. Learning-related groups on social networking sites like LinkedIn and LearningTown have been filled with discussions about how learning, training, and development professionals have been handling the recession.

Yet as the recovery looms ever larger, there’s been a severe shortage of discussion about how to plan for a recovery. Read more . . .

Is your learning organization recovery-ready? Join the discussion.

As part of the 2009 Learning Leadership Summit, we’ve started a blog to get the discussion rolling.

What are your fellow learning leaders and peers doing to prepare for the coming recovery? What are the key questions that learning leaders should be thinking about during the recovery? Where do I start if I haven’t even thought about what to do?

Join the discussion on the Learning Leadership Summit blog.

Need a speaker for your gathering or event?
by J. Hruby
Fredrickson Communications Marketing Director

If you’re planning a team meeting, company or agency conference, industry gathering, or other event and you need a speaker or seminar leader, Fredrickson Communications can help. Let our Fred Comm consultants provide a useful and memorable seminar or presentation for your event.

Our consultants have a wide range of experience in areas related to learning and development, usability, and user-centered design. Here are a few of the speakers we can provide and the topics they can address:

  • John Wooden – John is Fredrickson’s best-known and most-in-demand speaker. He informs and entertains hundreds each year with his seminars about usability, user-centered design, social media, social learning, and other topics related to the future of technology. John can provide seminars and presentations of all lengths, including keynote addresses.
  • Jay Kasdan – Jay is Fredrickson’s expert on measuring and evaluating the role training plays in achieving business results. He has delivered seminars to audiences of all sizes on topics related to training’s role in maximizing the business results of system implementations.
  • Robin Lucas – As Fredrickson’s Director of Project Management, Robin shares a wealth of information on a range of topics around better managing the business of learning.
  • J. Hruby – J’s interest is in leadership of the learning function and the effective use of technology to solve problems within learning organizations.
  • Jill Stanton – Jill is Fredrickson’s expert on the best practices for virtual classroom training. As the use of virtual classroom sessions continues to accelerate, the need to improve the effectiveness of this delivery method has also come to the forefront, making this a timely topic for many groups.

To get a feel for the perspective a speaker from Fred Comm can bring to your meeting or conference, take a look at the articles on the Fredrickson website. Contact us and we’d be happy to discuss providing a speaker or seminar leader for your event or gathering.

J Hruby

J. Hruby is Fredrickson’s Vice President of Sales & Marketing and he also works with Fredrickson’s clients to develop learning strategies and related eLearning, training, and performance support products.

J. enjoys writing articles and presenting to professional organizations about issues related to eLearning, user-centered design, and the role of technology in improving performance. He has presented seminars to the local chapter of the American Society for Training and Development (ASTD), the Minnesota Government IT Symposium, and the Society for Technical Communication (STC).

Before joining Fredrickson, J. was a training and quality systems documentation manager for AlliedSignal and Honeywell.

Let’s hear from you. Post your comments, questions, or other feedback on this article to the Learning Leadership Summit blog.

Danger!  Economic Recovery Ahead.

by J Hruby, Vice President, Sales & Marketing

I hate to pile more bad news onto what has essentially been two years of almost constant economic bad news, but here it goes: We could be on the verge of an economic recovery.

Bad news? Isn’t a recovery what we’ve been waiting for?

There’s absolutely nothing wrong with an economic recovery. At the same time, like any economic event, those who are most prepared will reap more of the benefits. How many leaders of learning organizations, or leaders of any other area in business for that matter, have spent any time at all preparing for an upcoming economic recovery?

As Shakespeare’s Hamlet said, “Ay, there’s the rub.”

Over the past two years, an inordinate amount of time and energy has been devoted to planning for and reacting to an ever-deepening recession. Learning-related groups on social networking sites like LinkedIn and LearningTown have been filled with discussions about how learning, training, and development professionals have been handling the recession.

Yet as the recovery looms ever larger, there’s been a severe shortage of discussion about how to plan for a recovery. There seems to be a planning disparity whereby a recession is an event that prompts specific thought and strategic planning, but an economic recovery is supposed to take care of itself.

The question of how to respond to a recovery that’s gathering steam will be a very real business problem as we progress through 2009. True leaders are built for change. Dr. John Kotter, author of the bestseller Leading Change, puts it this way in his book What Leaders Really Do:

“Management is about coping with complexity. … Leadership, by contrast, is about coping with change. Managers promote stability while Leaders press for change; and only organizations that embrace both sides of that contradiction can thrive in turbulent times.”

As with any change in business, the best time to have a plan in place is before you need to act on it, and planning takes leadership. It might seem especially counterintuitive now, after so many months of bad news, but the best time to plan for the learning organization’s role in the coming recovery is now—while we’re still in a recession. Why? Follow me.

A brief history of hard times

To understand why planning for a recovery is worth thinking about now, we need to look at the history of recessions and the signs of recovery. Like economists, history can’t give us an exact answer to the question of when the recession will end. But if we combine the historical view with recent signs of a coming recovery, we can make a strong case for better times being on the horizon and therefore the recovery planning clock is ticking.

First, the historical view. According to those who have the dubious distinction of tracking everything recession-related—The National Bureau of Economic Research (NBER)—the current recession officially began in December of 2007. So the Great Recession is now about 17 months old.

This begs the question, how long does the average recession last? It depends on what period of time you consider, of course. If we look at recessions from 1900 to present, the NBER data show that the average recession lasts 14.4 months. So this recession is worse than the average for the past century, but it’s still not even close to the Great Depression, which lasted for 43 months. Despite the dire predictions from cable news pundits over the past year, there is a strong consensus that we are not destined to repeat the Great Depression. At least not this time.

Taking a step beyond just historical averages, there is a case for optimism because consensus is building that the end of the recession is in sight. If the recent news is still not exactly good, then at least it’s becoming not-as-bad. The view of Federal Reserve chairman Ben Bernanke is that the current recession will begin to ease in late 2009. This has given Wall Street a boost, the crisis in the banking and financial sector has begun to stabilize, the gloomy real estate market is showing signs of life, and job losses have slowed.

Interpreting these events is still an exercise in reading tea leaves, but the doom-and-gloom is gradually being replaced with a view that even if we aren’t yet on the upswing, at least we’ve found the bottom.

Regardless of whether you believe the end of the recession is two, six, or ten months away, that still leaves the fact that like a recession, an economic recovery is a significant business event and deserves careful planning. As a leader in the business of learning, do you have a plan for when happy days are here again?

Ready for recovery?

Unfortunately, during recessions training and development groups tend to take their share of the hit. These cutbacks come in the form of reduced headcount, reduced budgets, reduced training offerings, and reduced external spending—or a combination of all of these.

According to the Learning Resources Barometer survey conducted by The MASIE Center in March 2009, 62% of learning organizations have experienced budget cuts. The same survey shows that 36% of respondents say the size of their learning group has decreased, while only 13% have seen their group size increase.

For companies that serve these learning groups, the news isn’t any better. A full 60% of respondents say their organizations have reduced spending on external services either moderately or substantially. This means the pain has been shared by everyone who has a stake in the business of learning and development.

During tough economic times the discussion centers on what to do to preserve the training function and minimize the impact of the recession. There have been countless articles and endless time given over to discussions of the what-to-do-during-a-recession variety. The general advice centers around becoming part of the strategic planning process, working on high-profile initiatives linked to helping the company through the tough times, staying visible, and demonstrating the value of training in terms of business results.

In our post-recession relief, we need to grant equal time to an equally important discussion: what should learning leaders do to help their organization respond to an economic recovery?

An upturn for learning

The response to an economic upturn will be different for each organization, but as you begin to build your recovery strategy for learning, consider the following questions:

How will the organization’s overall strategy and key priorities change when the recovery starts?
Of course, the related question for the learning leader is how will your group’s priorities be aligned with these changes? The clear alignment of business and learning priorities is always important, but it takes on even more significance coming out of a deep recession.

Are your learning group’s priorities “shovel ready”?
It’s likely that just as there is competition for the organization’s attention and resources in lean times, there will be an equal amount of competition as the recovery takes hold. Resources may become less constrained, but there will be lots of competition in the form of pent-up demand.

Having a clearly defined strategy and set of priorities, as well as having the business case ready for these priorities, will help speed the process of obtaining the resources your group will need to scale up to the demands of more prosperous times.

What should your learning organization look like after the recovery?
There’s no way to dance around the unfortunate reality that this recession has decimated many learning organizations. As difficult as that reality has been, it presents both a challenge and an opportunity for leaders who are faced with rebuilding a hard-hit learning organization.

For some learning groups, the goal will be to simply rebuild what they had before, but for others this will be an opportunity to evaluate the future needs and bring in skill sets that are better aligned with the new direction. This is another reason why planning for the recovery is important. To make strategic hiring decisions, you first need a strategy.

Own or rent?
This isn’t in reference to real estate — it’s about staffing your learning group. While you’re deciding what your new organization will look like, ask yourself these questions for each role you’re considering staffing:

  • Does it really make sense to have an employee doing this?
  • Would it be done more efficiently and effectively by outsourcing it or by bringing in specialized staff on an as-needed basis?
  • Is it likely that I’ll be able to hire someone with the skill set I want and at a salary that I can afford both now and in the future?
  • What do I do to fill the need while I conduct the search?

For some roles, it may make sense to set aside the model of having your learning group consist solely of employees. Skill sets like LMS administration have become highly specialized and technical while at the same time being prone to workload peaks and valleys. It may make more sense to contract for this skill set on an as-needed basis rather than make it someone’s full-time job.

Time to get (re)organized?
This also may be the time to take on the difficult battle to either centralize or decentralize learning functions within your organization. Neither of these reorganizations is an easy transition, but they won’t get any easier as the pace of the recovery quickens and the focus turns from cutting costs and headcount to maximizing sales and profits.

Does it still make sense?
For some initiatives like cutting costs and maximizing efficiency, the answer will still be a resounding “yes.” If you embarked on a mission to replace a travel-intensive, high-cost classroom training curriculum with virtual classroom and eLearning offerings during the recession, why stop now? Qualities like efficiency and effectiveness never go out of style. If you had a good strategy during the recession, the new strategy shouldn’t involve a total overhaul. Some good ideas are independent of the prevailing economic wind.

Did you remember to remember?
Hopefully, the coming recovery will lead to a long period of economic growth. Just how long, however, is anyone’s guess.

Make a commitment to writing down the lessons learned and the things you wish you had done during this recession, so you can refer to them or share them with others during the next period of challenging times. Experience is one of the most valuable tools that leaders have, and unfortunately this recession has taught us many tough lessons. These will be the lessons most worth remembering.

The recovery starts now

This issue of planning for the recovery should take on even more urgency because this recovery will certainly not be like past recoveries. Many industries have seen massive consolidation and the competitors that have survived will be hungry. They could also be in a stronger competitive position due to acquisitions, consolidations, or other factors. There will also be many changes to the legal and regulatory climate in many industries. The financial sector, for example, will have to adapt to new regulations for years to come, and training will certainly need to be a part of that process of change. Essentially, the only thing we really know about the coming recovery is that for most organizations it will not be simply a matter of resuming business as usual.

Just as a recession presents challenges, so too does a recovery. Having a planned response ready in advance will always be an advantage. Learning organizations that are (or want to become) a part of their company’s competitive advantage will need to quickly transition from survival mode to a position where they can rapidly engage in helping their organization make the most of the recovery ahead.

Now let’s hear from you. Post your comments, questions, or other feedback on this article to the Learning Leadership Summit blog.

J Hruby

J. Hruby is Fredrickson’s Vice President of Sales & Marketing and he also works with Fredrickson’s clients to develop learning strategies and related eLearning, training, and performance support products.

J. enjoys writing articles and presenting to professional organizations about issues related to eLearning, user-centered design, and the role of technology in improving performance. He has presented seminars to the local chapter of the American Society for Training and Development (ASTD), the Minnesota Government IT Symposium, and the Society for Technical Communication (STC).

Before joining Fredrickson, J. was a training and quality systems documentation manager for AlliedSignal and Honeywell.

Fredrickson Communications eZine - March 2009

by J Hruby, Vice President, Sales & Marketing

In this edition of the Fredrickson eZine . . .

Going Beyond Bullets: Creating Engaging eLearning Courses with Articulate Studio

by Tony Tao
Fredrickson Communications Instructional Designer

Articulate Studio is a suite of rapid eLearning development tools that allows users to create professional and interactive eLearning courses without intensive programming skills. At Fredrickson Communications, we’ve seen and helped many of our client organizations adopt this tool because of its efficiency and relative ease of use.

The complete Articulate Studio package includes:

  • Presenter – Converts your PowerPoint slides into a Flash presentation.
  • Engage – Helps to create engaging interactions for your course.
  • Quizmaker – Develops graded assessments or non-graded surveys.
  • Video Encoder (‘09 version only) – Converts full-motion video clips for use within courses.

Articulate Studio packs a lot of functionality in one package, but having a powerful authoring tool is just a good start. Creating an effective eLearning course requires solid knowledge and skill in the areas of instructional design, graphic design, and usability in addition to the ability to use the development tool.

Read more . . .

Got Compliance?

by J. Hruby
Fredrickson Communications Marketing Director

In these times of ultra-lean staffing, many of our learning and development clients tell me that projects are being added to their to-do list at a pace that’s faster than they are checking off the ones they’ve completed. In many organizations this puts training groups in a bind. They need to take on roles in urgent business projects linked to revenue growth and cost reduction. But that leaves few (if any) resources left to attend to the ongoing business functions of the training group.

One of the functions that the training group provides in many organizations is fulfilling and tracking various compliance requirements. Training groups are responsible for everything from government-mandated training to industry certifications to managing continuing education requirements. Organizations also look to their training group to provide the technology to keep compliance-mandated records.

In these lean times it can be difficult to separate the urgent from the important, but compliance is truly in the “important” category. Don’t let your compliance guard down for even a second.

Nuts-and-bolts requirements like tracking course attendance or making mandated information available can seem like minor concerns given the current economy. But they still need to be done. Regulators and auditors generally won’t accept a recession as an excuse for a compliance or record-keeping lapse. At the very least, sorting out a compliance issue can eat up valuable time and resources that are already spread thin. However, if a compliance lapse is discovered after an accident or during an audit, there won’t be any opportunity to fix the problem.

Fredrickson has helped many organizations with their compliance needs. We can provide everything from training development and delivery, to documentation, to LMS support. We also can design and build applications to track compliance requirements. Contact us and we’d be happy to discuss how we can help.

Tony Tao

Tony Tao develops eLearning courses using authoring tools ranging from Adobe Dreamweaver and Flash to rapid development tools such as Adobe Captivate and Articulate Studio. He also works with Fredrickson’s clients in roles that include instructional design, content development, visual design, and project management.

Before joining Fredrickson Communications, Tony worked as an instructional designer and training specialist in several organizations and companies in both the US and China. Tony received his MS degree in instructional design and training at St. Cloud State University. His research focused on asynchronous eLearning practices and development tools.

Going Beyond Bullets: Creating Engaging eLearning Courses with Articulate Studio

by Tony Tao, Instructional Designer and eLearning Developer

Articulate Studio is a suite of rapid eLearning development tools that allows users to create professional and interactive eLearning courses without intensive programming skills. At Fredrickson Communications, we’ve seen and helped many of our client organizations adopt this tool because of its efficiency and relative ease of use.

The complete Articulate Studio package includes:

  • Presenter – Converts your PowerPoint slides into a Flash presentation.
  • Engage – Helps to create engaging interactions for your course.
  • Quizmaker – Develops graded assessments or non-graded surveys.
  • Video Encoder (‘09 version only) – Converts full-motion video clips for use within courses.

Articulate Studio packs a lot of functionality in one package, but having a powerful authoring tool is just a good start. Creating a high-quality eLearning course requires solid knowledge and skill in the areas of instructional design, graphic design, and usability in addition to the ability to use the development tool.

As the functionality and features of the Articulate products have increased, so too have the demands they place on the instructional designer and developer. I’d like to share some of my experiences in using Articulate Studio to develop courseware for our clients. Often, our clients are subject matter experts who have been asked to create eLearning courses as an alternative to classroom training. For the most part, they are new to eLearning development and Articulate is the only tool they’ve been exposed to.

My suggestions in this article are intended for those who are new to eLearning development and are using Articulate Studio as their development tool of choice. I hope you find these helpful and time-saving.

Keep the pace in mind

Creating a self-paced eLearning course requires a systematic approach to delivering the content. In classroom training, real-time communication allows the instructor to adjust the pace of the learning. An eLearning course needs to give learners a similar ability to control their learning pace and the time to react to and absorb the content.

To manage the pace of an Articulate course, the designer needs to build in breaks between the key learning objectives and systematically reinforce the learning outcomes by using interactions and quizzes.

Some examples are:

  • Add Engage interactions, short quizzes, or learning games in the middle of long sections of content.
  • Use visual aids and animations to explain complicated concepts.
  • Add a short summary at the end of each lesson to recap the learning objectives.

Because Articulate makes it very easy to add a lot of written content to a course (by way of easy importation of PowerPoint slides), there is a temptation to overload the course with written content.

Place yourself in the learner’s seat, and answer these questions:

  • Is the content comprehensive enough to support the learning objectives?
  • Is the pace too fast or too slow?
  • Is the “seat time” too long?
  • Are the instructions for using the navigation buttons and features clear?

It is always helpful to run a pilot test with some of your target learners and ask for their feedback and suggestions. Usually, piloting the course with a small group of testers will reveal any problems that exist with the course’s pace, flow, seat time, and usability.

Write for narration

One of the amazing features of Articulate Studio is the ease of synchronizing the animated bullets with audio narration. However, just because Articulate makes it technically easier doesn’t let the instructional designer off the hook.

Writing for narration can be a difficult task. Writing content for use as an audio script is very different from writing content that is to be read by the user.

Here are some useful tips in writing a good audio script:

  • Avoid long sentences and jargon. Short sentences are easier for narrators to read, which means fewer mistakes and less time spent recording.
  • Use transition language to create a bridge between topics and lessons.
  • Read your content aloud. If you can’t get through it smoothly, your learners may have difficulty following it too.

When recording your narration, read it confidently and with expression. Narration is a skill by itself and it takes practice. If you don’t have confidence in your narration quality, consider using professional narrators.

Use visual design to your advantage

Good visual design can dramatically improve the appeal of your course and it’s also important for usability. Keep in mind that simple design is often the best design.

For example:

  • Use consistent colors and font styles for your content.
  • Use light colors for the slide background.
  • Apply the same visual treatment to all graphics.
  • Put simple bulleted lists on your slides instead of long paragraphs.
  • Use different slide layouts to regain learners’ attention.

Be creative

Yes, you can create a very sharp course using Articulate Studio, but only if you can break out of the linear nature of PowerPoint slides.

Consider using the same techniques that we use in a comprehensive eLearning course. For example, create role-based characters, themes, and branching scenarios. Using these elements will create a different learning experience, and will make your course more engaging to learners.

Articulate Studio offers many tools to help author creative and engaging courses, but as I wrote in the introduction, as the software becomes more feature-rich, the demands placed on the developer and instructional designer become greater. Rapid eLearning tools are often sold based on the premise that using the software, “anyone can create eLearning.” While that may be partially true, I would argue that it still takes a considerable amount of skill to create good eLearning. As the development tools become more sophisticated, it’s natural that it will take even more skill to use them well.

To summarize, you can create a great eLearning course in Articulate by using appropriate instructional design approaches, writing style, and visual design. And, trust your creativity! I will share more of our hands-on experiences of using Articulate Studio in future articles.

Jay Kasdan

Jay Kasdan joined Fredrickson Communications in 2004. He has over 15 years of experience designing, developing, and implementing training programs. He also specializes in measuring performance solutions and training effectiveness, and he has developed training assessments for all four levels of Kirkpatrick’s training evaluation model.

While Jay was a training manager for Deluxe Corporation’s American SAP implementation, his group won the prestigious National Impact Award for SAP implementations from the Americas’ SAP Users’ Group (ASUG).

Jay holds an MS in Vocational Education from the University of North Dakota.

Developing Training and Measurement Strategies that Produce Business Results

by Jay Kasdan, Project/Account Manager

As training professionals, we need to become experts at asking the right types of questions to develop performance solutions for our clients. This is what led me to develop the Fredrickson Business Initiative Strategic Questionnaire. For those of you who want to learn more about this questionnaire, plan on attending my session, Project Implementation: The People Component of Reaching Your Business Goals, at the Minnesota Government IT Symposium on December 10, 2009, from 11:00 a.m. to noon.

Surprisingly, most training and measurement strategies fail before they even have a chance to succeed. The reason is that many trainers fail to ask the most important training question:

“What are your business goals for the project/initiative?”

It’s no coincidence that this is the first question on my Initiative Questionnaire (PDF). Let me explain why.

The most important question

Defining the business goals for any initiative would seem like an obvious step—and something that should be done long before training becomes a part of the initiative. However, many times trainers simply don’t ask about the business goals. And when they do? Consider these real-life stories.

The first story is one I’ve heard dozens of times from people using the questionnaire. During the initial meeting with the project sponsor to discuss the training request, the trainer or training manager asks, “What are your business goals for the project/initiative?”

The response tends to be an uncomfortable silence followed by a promise to get back to the trainer soon. Surprising? Absolutely, but it happens more often than most of us would believe.

Sometimes it emerges that while there may be a goal, it’s not very well thought out from a business perspective. This was illustrated to me by a trainer from a major retail company who told me the story about asking the question and getting an answer that was anything but a business goal. The answer the trainer received? “We want people to feel better after the training.”

While it’s admirable that they wanted people to feel better, it wasn’t the type of business goal one would normally base an initiative on. So the trainer and the business client agreed that the question needed more thought. After more discussion, the result was significantly different: To decrease a specific behavior that was leading to a safety concern.

Think of where they started and where they ended in defining the business goal. Now think of the difference this made in the training and the ultimate business result. That is one powerful question.

Sometimes the question can help your clients help themselves. I was working with one of our major manufacturing clients to redesign their three-day class on their software development methodology. As we discussed the business goals, I learned that one of the most important elements of meeting their business goals was the development of a list of expected productivity gains and cost savings that would be achieved when the system was fully implemented. The plan was to monitor and maximize system use after implementation.

Unfortunately, the current three-day course had less than five minutes dedicated to this concept. Anyone like to guess what major change we made to the instructional design? Exactly.

The power of acceptance

Unfortunately, many initiatives with clearly stated business goals still fail. Why? There are a variety of reasons, but one that is often overlooked is the power of acceptance.

In the book Making Six Sigma Last, George Eckes introduces a simple equation for change:

Quality × Acceptance = Effectiveness

Most initiatives focus their time, effort, and resources on the quality component of the equation. Yet they fail to develop a plan that will effectively move the outcomes of the initiative into the daily work procedures of the organization.

The problem is that achieving the highest level of system quality—let’s say a “perfect” score is 10—can still be negated by a low level of acceptance. Using the Quality × Acceptance = Effectiveness equation, 10 × 0 = 0.

Thus the overall effectiveness of the initiative can be dramatically limited or even eliminated by a poor level of acceptance. As trainers, how can we develop training and measurement solutions that will help move the initiative through acceptance to measurable business results?

The four areas and 20 questions of the Fredrickson Business Initiative Strategic Questionnaire will help guide project managers, senior leaders, and trainers in identifying key information and gaps as they develop their solution. The questionnaire covers four main topics:

  • Business results
  • Processes, skills, and tasks
  • Proficiency
  • Implementation, measurement, and accountability

The remaining questions in the questionnaire will be covered at my seminar at the Minnesota Government IT Symposium on December 10, 2009, from 11:00 a.m. to noon. Complete information on the Symposium, including a full schedule and registration information, is available here. I look forward to seeing you at the session.

J Hruby

J. Hruby is Fredrickson’s Vice President of Sales & Marketing and he also works with Fredrickson’s clients to develop learning strategies and related eLearning, training, and performance support products.

J. enjoys writing articles and presenting to professional organizations about issues related to eLearning, user-centered design, and the role of technology in improving performance. He has presented seminars to the local chapter of the American Society for Training and Development (ASTD), the Minnesota Government IT Symposium, and the Society for Technical Communication (STC).

Before joining Fredrickson, J. was a training and quality systems documentation manager for AlliedSignal and Honeywell.

Fredrickson Communications eZine - February 2009

by J Hruby, Vice President, Sales & Marketing

In this edition of the Fredrickson eZine . . .

Tapping into the Wisdom of the Crowd: Making it Work

Part two of a two-part series
by Josh Welsh
Fredrickson Communications Usability Analyst

In part one of this series about online collaboration I wrote about several Do’s and Don’ts to keep in mind when using wikis as collaboration tools. I covered these key points:

  • You can’t expect large numbers of collaborators to solve everything.
  • You will need to provide structure for your collaborators.
  • Eighty percent of the work will be done by twenty percent of the contributors.

Now we’ll take a look at three ways to help structure online collaboration in an enterprise environment. I found these principles in Steven Weber’s The Success of Open Source, which describes the open source software movement from its origins in the development of the UNIX operating system in the early 1980s. Weber offers several principles for collaboration. I find the following three to be especially relevant for large-scale organizations interested in using online collaboration tools:

  • Make it interesting.
  • Make it meaningful.
  • Make it transparent.

Read more . . .

Let Fredrickson help with the small stuff – Don’t let the technical hurdles stop you

by J. Hruby
Fredrickson Communications Marketing Director

The current economy places a lot of pressure on organizations of all kinds these days. This means the learning groups that serve these organizations are also under pressure.

Learning groups need to develop and deliver effective learning programs that improve performance now—when it’s needed most. And they need to do so quickly and efficiently. They also need to keep their learning organizations running smoothly and to be recognized for delivering high-quality, valuable services to the organization.

But Learning Management Systems need maintenance and support, older eLearning courses need to be kept current, and new learning products need to be developed quickly and with fewer resources. All this comes at a time when staffs are shrinking and technical skills may be in high demand and short supply.

Many of our clients know Fredrickson can help with larger-scale projects, but we can also provide many individual technical services to help your learning group keep pace. We can:

  • Develop individual Flash interactions to fit within eLearning courses
  • Design and develop online job aids or process tools
  • Update eLearning courses
  • Fix broken eLearning courses
  • Troubleshoot LMS issues
  • Provide an LMS administrator or back-up administrator
  • Convert eLearning courses that were developed using obsolete tools or technologies
  • Add LMS wrappers to existing courses
  • Update or modify intranet sites
  • Set up processes for source file control
  • Create asset libraries to organize sound and image files used in eLearning courses

Could these or other technical services help you? Contact us and we’d be happy to discuss how we can help.

Networking Opportunities – Real life and virtual ways to connect with your learning and communication peers

by J. Hruby
Fredrickson Communications Marketing Director

One of your most valuable career resources, in good times and bad, is your personal network. Whether it’s a virtual network of online peers, or a group you meet face-to-face, these are the people who can help you by answering questions, sharing experiences, or introducing you to others who can help you in your career.

Here are just a few of the networking organizations, both in the Twin Cities and online, that I think learning and communications professionals will find helpful:

  • PACT (Professional Organization of Computer Trainers) – No longer just about computer training, this is a unique organization for technical training professionals in the Twin Cities. Find out more at the PACT website.
  • Corporate University Roundtable – An organization dedicated to the needs of those who hold leadership or management positions in corporate learning organizations. Learn more by visiting the Roundtable’s website.
  • Learningtown – A very active virtual community for learning professionals.
  • Intersect: Where public-sector communications and technology professionals meet. Intersect is a forum for those in public sector communications and related IT roles in Minnesota. Intersect members come from all levels of government to discuss issues related to using technology to communicate. See the Intersect web page for more information.
  • LinkedIn – There are many learning-related groups that are part of the networking mega-site LinkedIn. You need to be a LinkedIn member to join the groups.
    My favorites are:
    – Learning, Education and Training Professionals Group
    – Workplace Learning and Performance Forum
    – Training Professionals

Josh Welsh

Josh Welsh has a variety of experiences writing and editing scientific and technical documentation, and he also works on usability and user interface design projects for Fortune 500 companies and the public sector. Before joining Fredrickson Communications, Josh worked as a technical writer and a public radio reporter, and he has taught English to German high school students.

Josh is finishing an MS degree in scientific and technical communication at the University of Minnesota. His research interests lie in the nature of asynchronous, “low-structure” online collaboration. In his research, he asks questions such as, “How do contributors to wikis and open-source software projects work together?” and “How does the changing relationship between author and audience affect the work of technical communicators?”

References
  • Social Learning Technologies Matrix (.pdf) – A concise guide to social learning technologies and their applications.
  • Weber, S. (2004). The Success of Open Source. Cambridge, MA: Harvard University Press.
  • The GNU General Public License agreement, or GPL, is the most widely used open-source license agreement. It was created by Richard Stallman in 1989 to facilitate the sharing and development of open-source software.

Tapping into the Wisdom of the Crowd: Making it Work

by Josh Welsh, Usability Analyst

In part one of this series about online collaboration I wrote about several Do’s and Don’ts to keep in mind when using wikis as collaboration tools. I covered these key points:

  • You can’t expect large numbers of collaborators to solve everything.
  • You will need to provide structure for your collaborators.
  • Eighty percent of the work will be done by twenty percent of the contributors.

Now we’ll take a look at three ways to help structure online collaboration in an enterprise environment. I found these principles in Steven Weber’s The Success of Open Source, which describes the open source software movement from its origins in the development of the UNIX operating system in the early 1980s. Weber offers several principles for collaboration. I find the following three to be especially relevant for large-scale organizations interested in using online collaboration tools:

  • Make it interesting.
  • Make it meaningful.
  • Make it transparent.

These principles have become apparent as groups working on open-source projects have collaborated over the years. You will need to create policies, guidelines, and work practices to ensure adherence to these principles in your own documentation process.

Make it interesting

In a truly open collaboration environment, contributors to a project choose what they want to work on. Volunteers can gravitate toward aspects of a project that interest them—this is one of the main reasons why so many people are willing to volunteer their time at all. On the other hand, they can also leave the project at any time, and that kind of freedom may not be practical in the workplace. But if you want to harness the power of your hive, you should look for ways to make the process of contributing as interesting as possible for your worker bees.

Make it meaningful

Weber describes this principle as “scratching an itch.” What he means is that developers often contribute to open source software projects because it helps them solve a tangible problem they have been dealing with.

Maybe it’s a hobbyist at home trying to build the ultimate personal video recorder, or maybe a programmer is looking for a dynamic way to display database contents for a project at work. In cases like these, the person may work on an open source project that helps solve a problem, and they are happy to share the solution with others. In a documentation setting, there may be people in your organization who repeatedly answer the same kinds of questions and would love to help document the answer for everyone.

Make it transparent

Another way to put this is, “Talk a lot.” This is one of the keys to the success of the Linux operating system. Disputes take place in public. Anyone is welcome to weigh in on an issue. Discussion is not always calm. And the records of the discussions remain available for anyone to refer to.

In open source development, this can lead to bruised egos and can even contribute to software forks (where a group of developers leaves their fellow contributors to take the software in a different direction). Again, in the enterprise environment, complete transparency may not be an option. But the more you can keep decisions and discussions open to all of the contributors to a project, the more those contributors will feel they have a voice and a stake in the final product.

Conclusion

In the end, collaborative documentation in an enterprise environment may never be able to follow the same chaotic development model that has made truly open source projects so successful. In fact, much of the strength of Linux, Firefox, and other free software lies in the flexibility and openness of the GNU General Public License agreement, which allows anyone to take the material and reuse it for any purpose. (The main restriction is that derivative products must also be distributed under the same licensing agreement.) Clearly, this kind of free distribution may not be an option for all organizations.

However, to the extent that you can create a structure to help mimic the kind of emergent collaboration that has evolved around these open-source projects, you will be able to tap into the wisdom of the crowd in your organization.

Josh Welsh

Josh Welsh has a variety of experiences writing and editing scientific and technical documentation, and he also works on usability and user interface design projects for Fortune 500 companies and the public sector. Before joining Fredrickson Communications, Josh worked as a technical writer and a public radio reporter, and he has taught English to German high school students.

Josh is finishing an MS degree in scientific and technical communication at the University of Minnesota. His research interests lie in the nature of asynchronous, “low-structure” online collaboration. In his research, he asks questions such as, “How do contributors to wikis and open-source software projects work together?” and “How does the changing relationship between author and audience affect the work of technical communicators?”

References
  • Social Learning Technologies Matrix (.pdf) – A concise guide to social learning technologies and their applications. By John Wooden, Fredrickson Communications Director of Usability Services.
  • Glass, R. L. (2007). “What’s with this blog thing?” IEEE Software, 24(5), 104-103.
  • Leadbeater, C. (2008). We-think. London: Profile.
  • Weber, S. (2004). The Success of Open Source. Cambridge, MA: Harvard University Press.

Firefox® is a registered trademark of the Mozilla Foundation. Linux® is the registered trademark of Linus Torvalds in the U.S. and other countries.

Crowds, Wisdom, and Work: Do’s and Don’ts for Documentation Through Online Collaboration

by Josh Welsh, Usability Analyst

Wikis and other online collaboration tools have been around for several years now. Wikipedia is the best-known example of a wiki, and its growth has been nothing short of tremendous. From the time it was launched in 2001 until 2007, Wikipedia grew by an astonishing 19 million percent. As wikis have grown on the World Wide Web, businesses have looked for ways to implement this tool at the enterprise level. Several of our clients use wikis for collaboration, as do my colleagues here at Fredrickson Communications.

But how can you cut through the hype and make the best possible use of this tool? Is it enough just to install wiki software on your intranet and tell your employees to start writing? Unfortunately, all of the benefits and challenges of enterprise-level mass collaboration have yet to be fully explored. However, based on research I’ve done at the University of Minnesota, I can offer early adopters of enterprise-level mass collaboration tools some do’s and don’ts when implementing these technologies in the workplace:

  • Don’t expect large numbers of collaborators to solve everything.
  • Do provide structure for your collaborators.
  • Do remember that the 80/20 rule still applies.

Don’t expect large numbers of collaborators to solve everything

I analyzed 50 random Wikipedia pages to see if there was a correlation between the number of editors who worked on an article and the article’s readability according to the Flesch Reading Ease scale. I was surprised to find almost no correlation between these two factors. On the organizational level, these two variables may not be significant, since readability can be judged in a number of ways and the organization may have a very limited set of potential contributors. However, the takeaway is clear: a large number of editors will not necessarily create consistent, readable documentation. If consistency is a goal, you’ll have to follow the tip below.
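
If you want to run a rough version of this kind of check on your own wiki content, the short Python sketch below shows one way to do it. To be clear, this is not the analysis behind the findings above; it is a minimal, illustrative example under simple assumptions. It uses a crude syllable-counting heuristic to approximate the Flesch Reading Ease score and a hand-rolled Pearson correlation, and the sample pages and editor counts are invented placeholders for whatever data your own wiki export or API provides.

    # Minimal, illustrative sketch: estimate a Flesch Reading Ease score for a
    # set of wiki pages and check how it correlates with editor counts.
    # All data here is made up; in practice you would pull page text and editor
    # counts from your own wiki's export or API.

    import re
    from math import sqrt

    def count_syllables(word):
        # Rough heuristic: count groups of consecutive vowels; drop a silent final 'e'.
        word = word.lower()
        count = len(re.findall(r"[aeiouy]+", word))
        if word.endswith("e") and count > 1:
            count -= 1
        return max(count, 1)

    def flesch_reading_ease(text):
        # Flesch formula: 206.835 - 1.015 * (words/sentences) - 84.6 * (syllables/words)
        sentences = max(len(re.findall(r"[.!?]+", text)), 1)
        words = re.findall(r"[A-Za-z']+", text)
        n = max(len(words), 1)
        syllables = sum(count_syllables(w) for w in words)
        return 206.835 - 1.015 * (n / sentences) - 84.6 * (syllables / n)

    def pearson(xs, ys):
        # Pearson correlation coefficient, computed directly.
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = sqrt(sum((x - mx) ** 2 for x in xs))
        sy = sqrt(sum((y - my) ** 2 for y in ys))
        return cov / (sx * sy)

    # Hypothetical sample: (number of editors, page text)
    pages = [
        (3, "The printer is on the second floor. Ask the help desk for toner."),
        (42, "Configuration of the service requires administrative privileges and careful review."),
        (17, "Submit the expense report by Friday. Keep your receipts."),
    ]

    editors = [e for e, _ in pages]
    scores = [flesch_reading_ease(t) for _, t in pages]
    print("Correlation of editor count vs. readability:", round(pearson(editors, scores), 2))

A correlation near zero, as in my Wikipedia sample, simply means that adding more editors did not by itself make pages easier or harder to read.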

Do provide structure for your collaborators

In the popular mind, wiki collaboration is sometimes thought of as completely without structure. Anyone can make any change they wish, at any time, completely anonymously. This may leave readers of wikis asking themselves, “Do I really trust the writings of a wiki contributor, knowing absolutely nothing about his or her qualifications for instilling some semblance of expertise into a wiki?” (Glass, 2007)

However, successful open collaborations do not develop without structure. When people collaborate on a large scale using tools such as a wiki, they self-organize into certain roles, such as author, editor, tester, and user. Likewise, in successful large-scale open source programming projects, such as the development of the Linux kernel, the contributors organize themselves into roles and responsibilities. Some people work on bug fixes, others approve those fixes and send them out for testing, and others make the final decision on when a fix is ready to be put into the next release. Still others work on the cutting edge, creating new features for future releases. Since you are not likely to have the critical mass required for this kind of self-organization, you need to help create this structure. Who will write? Who will edit? Who will review and approve? Who will maintain the resource over time?

Do remember that the 80/20 rule still applies

Finally, realize that work in open environments does not distribute itself evenly. Following the Pareto principle (the 80/20 rule), it’s likely that 80 percent of the work in an open collaboration project will be done by 20 percent of the people. Don’t expect that once you set up a wiki and establish a structure for your collaboration team, all of the team members will put forth the same amount of effort. The strength of truly open-source projects comes from their flexibility: team members contribute however much they want, whenever they want to do so. This may not be practical for the goals you have for your wiki, but ignoring the 80/20 rule won’t make it go away.

Conclusion

Online collaboration does promise a great deal. And it has a proven track record, as evidenced by the success of open-source software projects such as the Firefox® web browser and the Linux® operating system. But to make the most of it, organizations need to remember that it is not a panacea. You’ll still need to provide structure and leadership for it to work to its fullest potential.

In part two of this article, we’ll take a look at some specific best practices that have been proven by the success of the open-source software movement:

  • Make it interesting.
  • Make it meaningful.
  • Make it transparent.

Jill Stanton

Jill Stanton joined Fredrickson Communications in 1999. She has been in the documentation and training field for 12 years. While at Fredrickson, Jill has served as an information developer, instructional designer, trainer, and project manager.

Jill earned her first BA in English, Theatre, and Secondary Education from Hamline University. She then obtained a BA in Psychology from Augsburg College.

Park Nicollet and Fredrickson Win STC eLearning Award of Excellence

by Jill Stanton, eLearning Lead

The Twin Cities Community of the Society for Technical Communication (STC-TC) has given a 2008 Award of Excellence to the Park Nicollet Pandemic Influenza Preparedness at Work and at Home eLearning course. This course was designed and developed by Fredrickson Communications in partnership with Park Nicollet. The STC’s Award of Excellence is given to a course that “consistently meets high standards in all areas.”

The Pandemic Influenza Preparedness at Work and at Home eLearning course is a highly interactive experience that teaches health care professionals how to prepare for and react to a worldwide influenza outbreak. This course brings into focus both the workplace and personal implications of an outbreak. The course is designed to increase health care professionals’ knowledge and awareness of the rigorous demands that a flu pandemic would place on them.

At Fredrickson Communications we’re very proud of our role in helping our client create this award-winning course and we congratulate Park Nicollet and thank them for their partnership on this project. If you would like to see a demonstration of this course, please use the information on the Contact Us page to get in touch.

Formal presentation of the award will take place at the STC-TC awards ceremony on May 12, 2009. There is more information about the awards ceremony on the STC-TC website.

J Hruby

J. Hruby is Fredrickson’s Vice President of Sales & Marketing and he also works with Fredrickson’s clients to develop learning strategies and related eLearning, training, and performance support products.

J. enjoys writing articles and presenting to professional organizations about issues related to eLearning, user-centered design, and the role of technology in improving performance. He has presented seminars to the local chapter of the American Society for Training and Development (ASTD), the Minnesota Government IT Symposium, and the Society for Technical Communication (STC).

Before joining Fredrickson, J. was a training and quality systems documentation manager for AlliedSignal and Honeywell.

Fredrickson Communications eZine - January 2009

by J Hruby, Vice President, Sales & Marketing

In this edition of the Fredrickson eZine . . .

Crowds, Wisdom, and Work: Do’s and Don’ts for Documentation Through Online Collaboration

Part one of a two-part series
by Josh Welsh
Fredrickson Communications Usability Analyst

Wikis and other online collaboration tools have been around for several years now. Wikipedia is the best-known example of a wiki, and its growth has been nothing short of tremendous. From the time it was launched in 2001 until 2007, Wikipedia grew by an astonishing 19 million percent.

As wikis have grown on the World Wide Web, businesses have looked for ways to implement this tool at the enterprise level. Several of our clients use wikis for collaboration, as do my colleagues here at Fredrickson Communications.

But how can you cut through the hype and make the best possible use of this tool? Is it enough just to install wiki software on your intranet and tell your employees to start writing? Read more . . .

Featured Links

by J. Hruby
Fredrickson Communications Marketing Director

I always ask the employees here at Fredrickson Communications to send me links to timely and interesting articles and blog entries. Here are a few recent ones in the realm of usability, learning, and communications that I think you’ll find interesting:

Designing Usable, Self-Paced e-Learning Courses: A Practical Guide by Michael Feldstein, SUNY Learning Network, and Lisa Neal, eLearn Magazine

This is a great primer on the subject of usability as it relates to eLearning courses. eLearning, after all, is a form of software. This article provides a thorough introduction for the eLearning professional who’s interested in understanding how usability can enhance the eLearning user’s experience. Read the article . . .

The Impact of Corporate Culture on Social Media: An IBM case study by Adam Christensen

Many of Fredrickson’s clients have expressed an interest in exploring the potential of social media tools to enhance learning and communication within their organizations.

Two issues almost immediately come to the forefront in these discussions:

1. How will our company culture react to this technology?
2. What can be done to select and introduce tools to make them more likely to succeed?

Adam Christensen has some good thoughts on this topic in his blog entry (with a related slide show). Read the blog entry . . .


Are You Ready for the New Ruthless User?

by J Hruby, Vice President, Sales & Marketing

In a recent presentation to Minnesota’s LifeScience Alley™ trade association, John Wooden, Fredrickson Communications’ director of usability services, joined me in describing how user experience testing can support and enhance a business’ marketing efforts. One part of the presentation seemed to furrow a few brows: We asserted that the days of the casual website visitor are long gone—if they were ever here at all. The new breed of site visitor is ruthlessly task-driven, aiming to quickly find what they are looking for and move on. We introduced the audience to the New Ruthless User.

OK, so maybe there isn’t really a new breed; it’s just web evolution at work. The emergence of the New Ruthless User (NRU) is the result of the ever-expanding size of the web and the huge amount of content that’s available. If you think about your own experience with Google, the problem is that even a fairly narrow search returns thousands of results, of which only a few are of real interest to you. You might view the rest as web litter that’s cluttering up the path.

But we still need to sort the web wheat from the web chaff. To see which websites really offer what we’re looking for, we have to make quick decisions about whether a website offers anything of value. We make these decisions within seconds of scanning a web page.

I’m a ruthless user, you’re a ruthless user

Consider how you use the web, and I think you’ll agree that you, too, have become an NRU. Most of us simply don’t have time to read every web page we open in order to make objective judgments about the information it contains. Instead, we develop a web-enabled sixth sense. We open a website and our web sense takes over. “Is this the site I’m looking for?” Yes/no, stay/go. We either keep reading or we hit the Close or the Back button. We make the decision in seconds, based on criteria that we don’t really think about or understand. Now that’s ruthless! But that’s the direction the web has pushed us in.

Also, think about your expectations when it comes to the web. What’s our response to almost any question these days? Google it! It’s that simple.

We expect instant access to the exact information we want, exactly when we want it. Or maybe it’s even stronger than that: we demand it. eLearning guru Elliott Masie calls this “fingertip knowledge,” and increasingly, we expect that whatever we need to know will be at our fingertips, whenever we need to know it. It’s just become an article of information-age faith that the information we want is out there somewhere. Or at least it should be out there. And we should be able to find it by simply typing a few words in a search engine and clicking. Ta-da! Instant knowledge.

I want it all and I want it now!

Now, think about what happens when your expectations aren’t met. How do you feel when you look for information on a website and you just cannot seem to find it?

Let’s say you look everywhere on a restaurant’s website, but you cannot find its address or phone number. What happens? Increasingly, we get frustrated and sometimes even angry. “What kind of company is this? They don’t even have this basic piece of information on their website. Do I really want to expend more effort trying to do business with a restaurant that forgets to put its address on its website? I guess I could call and ask for directions, but forget it. There are plenty of other places to eat.” And the mouse moves toward that dreaded Close button. Click.

Again, that’s pretty ruthless. And it’s also a pretty realistic vision of our expectations of the web, and what we do if we can’t find what we want.

If you have responsibility for a website or web content, you may wonder whether you can do anything to maximize the chances that the NRU will find value on your site—something worth staying for. The answer is “yes,” once you think about and understand the things that catch and hold this New Ruthless User’s attention. There certainly are adjustments that website owners and content creators can (and should) make.

My intention here is not to delve into specific must-dos, but rather to encourage you to look at your site through the eyes of the New Ruthless User. Ask yourself this: How well does our site meet the expectations of our New Ruthless User?

If you’re not happy with the answer to that question, contact us and let’s talk about an improvement plan. There’s too much at stake to let the NRU click the Close button.

Joyce Lasecke

Joyce Lasecke has been finding and nurturing talent in instructional designers and technical writers for almost 30 years. Having interviewed hundreds of people, Joyce is able to quickly assess whether a person will find what he or she is looking for at Fredrickson, and to provide that feedback. More likely than not, she’s able to direct a candidate to other resources and companies to help them find a great fit. She’s always open to a conversation with someone who may be interested in the life of a consultant in the workplace learning world.

Stop Guesstimating, Start Estimating

by Joyce Lasecke, President

Note: This article is republished with permission from Intercom, the magazine of the Society for Technical Communication. It was originally published in May 1994 and focuses on creating user documentation concurrently with software development.

If you’ve ever prepared a cost estimate for a project, you know that you can lose sleep over it. Did you think of everything? Have you uncovered the true complexity of the system? You can reduce anxiety by following a process that answers these questions and results in concrete and measurable information on which to base your estimate.

  1. First, you need to define the project thoroughly.
  2. Then, you can calculate project hours using formulas and other historical information from previous projects.
  3. Finally, you need to assess the risks of changes that affect your estimate and adjust it accordingly.

This article explains these steps and provides some specific estimating formulas that may work for your project.

Defining the project

A certain amount of analysis and planning needs to be done before you can estimate the work effort involved. Learning about the audience and the system is the first step. This research will enable you to determine the project scope and prepare an outline of each deliverable.

For example, let’s say that you will be developing a context-sensitive help system. Based on your understanding of the audience’s information needs, you can identify the types of help topics you’re going to provide, such as field help, procedures, and window descriptions. Then, based on your knowledge of the system, you can identify how many of each type of topic you’ll be writing. You’ll base your estimate on this concrete information.

However, the scope of the project is just part of what goes into an estimate. The people involved in the project and the process they follow will affect the hours needed to complete the project as much as — if not more than — the amount of material to be written. Table 1 contains some questions to answer about the process and people. What’s important here is to recognize any elements that are not “business as usual” for your department or company. As I’ll explain later, you’ll need to evaluate those elements to determine whether to adjust your estimate.

If you don’t already have the answers to these questions, ask for the time (and, if applicable, budget) to conduct what we refer to as a “design phase.” This phase can encompass needs assessment, audience analysis, project definition, and prototyping. It results in a blueprint for the deliverables and a work plan for the project. And, of course, it includes the cost estimate for developing the deliverables.

Table 1. Things that affect time needed to complete a project:

PROCESS

System development: What is the development, testing, and implementation plan? Is it progressing on schedule? Is the design stable or changing?

Source material: Are there any written design specs? Are they accurate and up-to-date? Will a test system be available to writers?

Schedule: Is there a reasonable amount of time to complete the project? Or do many tasks have to be compressed into a very short time?

Review cycle: How many reviews will there be? Will someone from the client’s staff serve as a referee to resolve conflicting review comments, or will the writer need to serve this role?

Tools: Have you previously used the hardware and software for documentation? Is technical support available? Have you worked out bugs for importing graphics, creating an index, or creating online help?

PEOPLE

Documentation team: How many team members are there? How experienced are they with the subject matter and corporate culture?

SMEs (Subject Matter Experts): How knowledgeable are they? Will they be available to answer questions several times a week? Are they supportive of the documentation effort?

Reviewers: How many are there? Are their roles defined? Are they likely to do their job?

All parties: Is everyone fairly united over approach and goals? Or is there a sticky political situation?

Calculating the estimate

At our company, we have developed formulas to estimate hours needed to develop content—this accounts for research and writing. We use a different set of formulas for paper materials than for online help. For all the other tasks on a project, we follow some general guidelines that apply whether the deliverables are paper or online.

Formulas for writing paper deliverables

We estimate writing time based on the number of pages. This is fairly standard among companies that estimate projects. We use a range of three to five hours per page for a two-review cycle. Consider the content when deciding on the hours-per-page formula. We have found consistently that step-by-step procedures require more hours per page than most reference information, so we use a different hours-per-page ratio for each type of information.

Formulas for writing online help

In the past, we used our hours-per-page formulas for estimating online help development. After gathering four years of historical data about hours needed to develop help systems, we have come up with the hours-per-topic formulas shown in Table 2. If you have no historical information of your own, feel free to try using these formulas. However, it’s important to come up with formulas that reflect your work situation.

Table 2. Examples of hours-per-topic formulas:

  • Step-by-step procedures: 4 to 5 hours per procedure
  • Glossary terms and definitions: 0.75 hours per term
  • Reference topics (explanations of concepts or theories, overviews, and product information): 1.5 to 2.5 hours per topic
  • Window descriptions that cover function and navigation but do not include field descriptions, per window:
      • 3 hours: text only
      • 4.5 hours: includes screen capture and cleanup
      • 6 hours: includes screen capture and cleanup, scaling the images, and/or applying hot spots
  • Field descriptions: 1 hour per field
  • Button descriptions: 0.25 hours per button (includes the graphics of the button)
  • Contents window: minimum of 4 hours (additional time will be required for creating *.CNT format files for Windows 95 or NT Help)
  • Search facility (customize, add to, and review): 20 to 40 hours total
  • Creating or modifying graphics other than screen captures (capturing icons, cleaning them up, and including them in files; creating icons; creating diagrams or other graphics): 0.5 hours per graphic object

Guidelines for other tasks and roles

For the other roles on a project, we use these guidelines:

Project leader: 15-20 percent of all other hours

Editor: 6-8 printed pages per hour, or 8-12 help topics per hour

Also remember to estimate time for other tasks for which writers are responsible. Here are some examples:

  • Participating in system design meetings
  • Testing the software (either formally or to make sure what you wrote is accurate)
  • Maintaining issues lists for the software development team
  • Participating in regular status meetings
  • Preparing meeting minutes
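To see how these numbers combine, here is a minimal sketch of an estimate calculation based on the Table 2 rates and the role guidelines above. The topic counts, the specific rates chosen from each guideline range, and the 18 percent project-leader figure are illustrative assumptions, not numbers from an actual project.

  // Illustrative estimate for a hypothetical online help project, using the
  // hours-per-topic rates from Table 2 and the role guidelines above.
  // All counts and chosen rates are assumptions for the sake of the example.

  interface LineItem {
    description: string;
    count: number;         // number of topics of this type
    hoursPerTopic: number; // rate chosen from the guideline range
  }

  const lineItems: LineItem[] = [
    { description: "Step-by-step procedures", count: 40, hoursPerTopic: 4.5 },
    { description: "Window descriptions with screen captures", count: 25, hoursPerTopic: 4.5 },
    { description: "Field descriptions", count: 300, hoursPerTopic: 1 },
    { description: "Glossary terms and definitions", count: 60, hoursPerTopic: 0.75 },
  ];

  // Writing hours: sum of (count x rate) for each topic type.
  const writingHours = lineItems.reduce(
    (total, item) => total + item.count * item.hoursPerTopic, 0);

  // Editing: guideline of 8-12 help topics per hour; assume 10.
  const topicCount = lineItems.reduce((total, item) => total + item.count, 0);
  const editingHours = topicCount / 10;

  // Project leader: 15-20 percent of all other hours; assume 18 percent.
  const projectLeaderHours = 0.18 * (writingHours + editingHours);

  const totalHours = writingHours + editingHours + projectLeaderHours;
  console.log(`Writing: ${writingHours.toFixed(1)} hours`);              // 637.5
  console.log(`Editing: ${editingHours.toFixed(1)} hours`);              // 42.5
  console.log(`Project leader: ${projectLeaderHours.toFixed(1)} hours`); // 122.4
  console.log(`Total estimate: ${totalHours.toFixed(1)} hours`);         // 802.4

Changing a count, or choosing a different point within a guideline range, immediately shows its effect on the total, which makes it easier to test-drive the estimate as described in the next section.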

Fine-tuning the estimate

After you’ve calculated hours for each line item in your estimate, look at the hours in the context of your project. Your goal is to determine whether the hours you’ve estimated are sufficient for the circumstances. There are two activities you should do to “test drive” your estimate.

First, lay out the hours over a time period so that you can see whether the work can be accomplished by the deadline with available staff. If you find that you need to add staff, figure out what that means for your estimate. Increased project management time for the extra coordination? Learning curve time? More hours allocated for status meetings?

Second, think about what would cause your estimate to be too low. What circumstances are beyond your control, and how likely are they to create problems for you? Such circumstances can be anything from scope changes to unavailable subject matter experts to political battles. You can address these in your estimate in one of two ways:

  • Identify the risk in the list of written assumptions on which your estimate is based. This is appropriate for ensuring that you can renegotiate the estimate if the scope changes, system development dates change, reviewers don’t respond on time, or other circumstances change.
  • Adjust the estimate to accommodate the risk and its implications. For example, if you know that the test system is often unavailable for days at a time, you may need to add hours to the project for the writers to gather and test information in more time-consuming ways. Or let’s say there is a high level of mistrust and animosity between two key managers due to issues regarding system design. You might want to allow for the possibility of another review and revision cycle to iron out conflicting review comments. This cycle could increase total project time by 15 to 20 percent.

You can now present the estimate to your client or manager with a high level of confidence. Your estimate will be credible because it is based on concrete information that defines the scope of the project. And it will be comfortable for you because it accounts for the intangible risks that you’ve anticipated. Finally, your list of assumptions gives you the grounds for renegotiating the estimate if any of those assumptions change.

Should you provide a cost estimate?

Sometimes projects are so undefined that it’s impossible to provide a valid estimate for completing deliverables. However, you can still provide some budgetary information without committing to finish work within a particular budget or time frame. Here are two approaches:

  • Provide a time-and-materials quote based on working at an hourly rate for the duration of the system development project.
  • Provide a sample estimate for a hypothetical deliverable, but work on a time-and-materials basis. The sample estimate at least gives the client or manager some sense of what it takes to develop documentation.

Editor’s note

Although Bill wrote this article in 1999, the techniques he describes remain viable and innovative today, despite changes in technology and software.

Automating Development Tasks for a Large-scale Help System

by Bill Lindsay, Senior Consultant

Although tools like RoboHELP® can speed up and streamline the process of creating help topics, there are still many repetitive tasks needed to build a WinHelp system that supports a large, integrated application.

This article summarizes one of the techniques that Fredrickson Communications used to automate the process of developing online help topics. Once the underlying structure and macros were in place, we were able to generate hundreds of help topics at a rate of 15 to 20 per minute.

Setting the scene

The project was a large manufacturing system composed of five application components. The online help team consisted of a project leader, several information developers, and an editor.

In addition to window-level help, procedures, and other aids such as toolbar help and a glossary, the online help system needed to provide context-sensitive help for about 750 unique window objects (primarily fields) that were used repeatedly throughout the application components.

The primary source of information for the field help was a client-maintained data dictionary that provided information about these objects such as internal (column) names, business names, data types, descriptions, and some notes regarding usage.

Our objective was twofold:

  • First, we wanted to use as much of the data dictionary information as we could in developing content for the field help.
  • Second, once the content was established, we wanted to create a process for automatically generating the help topics.

Developing the content

Having determined that we could use much of the information in the data dictionary, especially the descriptions, the next task was to define the process that we would use to generate and modify the content.

The major steps were:

  1. We identified the data dictionary records that we could reuse for field descriptions. To accomplish this, the first thing we did was to have the writers use an Access database to document field lists for their applications. From this list, we then ran queries to identify a unique set of column data that we could use from the data dictionary.
    Note: For the first release of the application suite, we used about 435 of 1000 records. For the second release, at which time additional application components were added, we used about 750 of 1100 records.
  2. Next, we copied the data dictionary records to the Access database. This step included changing some column names, translating data types, and appending the notes to the descriptive information.
  3. After the usable information was extracted, the new “Help” table was made available to the writers so that they could modify the existing field information and add further notations to reflect usage considerations appropriate to a particular application or window.

Creating the help topics using macros

At this point, the database Help table contained all of the information necessary to create the field help, such as topic titles, topic IDs, and descriptive text.

The remaining task was to create macros that would actually generate the field help topics for the approximately 750 field objects. From a construction standpoint, although all of the writers contributed to developing the content of the field help, one help developer was responsible for converting the field help information into help topics.

Our solution was a module consisting of three macros.

  • Two of the macros were recordable Word macros; their function was to prepare the Word document and create a topic.
  • The third was a Visual Basic™ macro that looped through the document and created a topic for each set of field help information.

Starting the process
The Visual Basic macro initiates the process:

  1. Before the macro is run, the Help table (containing the updated field information) is copied to the clipboard.
  2. When the module is executed, it starts the Visual Basic macro that controls the overall process.
  3. The first thing that the Visual Basic macro does is execute Macro #1.

Preparing the document
Macro #1 prepares the Word document:

  1. Macro #1 begins by pasting the Help table (from the clipboard) into a Word document.
  2. Next, the macro applies styles to the various titles, labels, and text descriptions. A style called “PopupTitle” is applied to each title that will eventually become a RoboHELP topic title.
  3. The document is then converted from a table to text.
  4. The final step that the macro performs is to remove unnecessary paragraph marks and apply additional spacing as specified.

Creating a RoboHELP topic
Macro #2 adds the footnote symbols and then codes each RoboHELP topic:

  1. Macro #2 starts at the top of the document and creates a topic. It uses the title with a style of “PopupTitle” for the topic title.
  2. Next, it copies the column_name (a vestige from the Help table), codes it as the topic_ID, and then removes it from the text.
  3. Last, other footnotes are added as needed. For example, in our project:
    • To give the index a single point of entry for a field listing, we applied the keyword (K) footnote “Fields (list of)” to each topic.
    • Similarly, to display field help in a small popup-like window named “object” when the topic was accessed from within the Help system (rather than as context-sensitive help from the application), we added the window (>) footnote “object”.

Completing the coding
The Visual Basic macro uses a loop to complete the coding:

  1. Using a loop, the Visual Basic macro runs Macro #2 for each set of field information that begins with the style “PopupTitle.”
  2. To prevent crashes due to memory problems, the macro includes a counter that clears the memory buffer by saving the document each time ten topics are created.
  3. When the last topic is created, the macro saves the document one final time.
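The control flow described above is not specific to Word. Purely as an illustration, the sketch below mimics the loop-and-checkpoint pattern in TypeScript rather than Visual Basic; the FieldRecord type, the createTopic formatting, and the saveDocument callback are hypothetical stand-ins for the Word document, the topic-coding macro, and the save step.

  // Illustrative sketch of the loop-and-checkpoint pattern described above.
  // The original implementation was a Word Visual Basic macro; everything
  // here (FieldRecord, createTopic, saveDocument) is a hypothetical stand-in.

  interface FieldRecord {
    title: string;       // the topic title (styled "PopupTitle" in the Word document)
    columnName: string;  // carried over from the Help table; becomes the topic ID
    description: string; // descriptive text for the field
  }

  function createTopic(record: FieldRecord): string {
    // One topic: title, topic ID, the shared index keyword, the popup window
    // assignment, and the descriptive text.
    return [
      record.title,
      `topic ID: ${record.columnName}`,
      `index keyword: Fields (list of)`,
      `window: object`,
      record.description,
    ].join("\n");
  }

  function generateTopics(
    records: FieldRecord[],
    saveDocument: (topicsSoFar: string[]) => void
  ): void {
    const topics: string[] = [];

    for (const record of records) {
      topics.push(createTopic(record));

      // Checkpoint: save after every ten topics, as the original macro did
      // to avoid crashes caused by memory problems during a long run.
      if (topics.length % 10 === 0) {
        saveDocument(topics);
      }
    }

    // Save one final time after the last topic is created.
    saveDocument(topics);
  }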

Maintaining the field help

From time to time, the data dictionary is, of course, revised. To update the field help part of the help system, we only have to query the data dictionary for new or changed records, copy this information to our Help database, fine-tune the description information as necessary, and then rerun our macros.

Saving some time

For this particular manufacturing project, from the time the Help table was copied to the clipboard to the time that the last topic was coded, it took less than one hour to generate the 750 field help topics on a low-end Pentium computer. Certainly, before you can use a process like the one described here, someone needs to develop (or extract) the topic content and ensure that the format of the help topics is consistent from one topic to the next. Once those things are established, however, there are savings to be enjoyed and repetitious tasks that can be avoided.

RoboHELP is a registered trademark of Adobe Systems Incorporated.
Visual Basic is a trademark of Microsoft Corporation.

Lola Fredrickson

Lola Fredrickson is the founder and CEO of Fredrickson Communications. She has more than 30 years of experience in all aspects of technical communication. She has worked in both the public and private sectors to produce all types of technical communication products, books, and training materials for organizations of all sizes.

Lola has published and presented numerous articles and papers in the area of technical communication. In 1993, she won the International Award of Distinguished Technical Communication from the Society for Technical Communication for the publication of a quality model for the profession.

Lola holds a BS in art and design, and an MS in plant sciences and chemistry. She is an Associate Fellow of the Society for Technical Communication, serves on the advisory council for the technical communications program at Metropolitan State University, and has taught technical writing courses at the Management Center of the University of St. Thomas. Lola is also active in numerous community organizations.

10 Questions to Ask When Selecting an eLearning Vendor

by Lola Fredrickson, CEO, Fredrickson Communications

Selecting a vendor to help you design eLearning content, or redesign an existing eLearning web site, can be an overwhelming task—especially if you don’t have extensive knowledge of current practices and skill sets. To help you establish a framework for comparing and evaluating vendor strengths and weaknesses, we’ve compiled a list of essential questions to ask during the selection process.

1. What skills does the vendor bring to the project? Will they fully complement and supplement the skills that you have in-house?

Tip: Make sure that between your in-house team and the vendor’s team you have strengths in each of the following areas:

  • Content design
  • Instructional design
  • Programming
  • Project management
  • Technical design
  • Usability engineering
  • Visual design

Covering all of these skills in enough depth to provide good quality typically requires a minimum of three people.

2. Can the vendor work within my budget? Can they ensure effective interactivity at a low cost?

Tip: This is likely to be one of the most important factors in your selection, so be sure to let vendors know the range of your budget so they can propose something appropriate. Explain your goals and vision to vendors, and ask if they have experience producing effective products within that budget. For example, a company that has produced only high-level, graphically sophisticated training content for large marketing or advertising efforts might have difficulty working within the budgetary constraints of an internal project to create eLearning for a business application.

3. Can the vendor educate you about choices you might make and help you clarify your business goals and performance objectives? Can they help you determine the best solution to meet your needs within your time and budget?

Tip: When you interview vendors, determine if they genuinely listen and if they are able to share recommendations and alternatives. For example, do they clearly present the options available for technical design and visual design? Do they discuss advantages and disadvantages of different methods of creating and presenting information? Do they arrive with a solution they want to sell, or do they first listen to your needs, understand your technical environment, and gather information on your learning objectives and audience?

4. What project management experience does the vendor bring to your project? Do they have an established and refined process to guide their work?

Tip: A complaint sometimes heard about vendors is that they “require a lot of management.” When you hire a vendor, you should be getting something off your plate, not adding something to it. Listen to their comments about project management, and seek reassurance that they can handle what’s necessary while regularly and clearly reporting to you on the project status.

5. Whether your project involves information design or redesign, how will the vendor ensure that the site or content is usable?

Tip: Look for a discussion about usability testing, site assessment, and experiences with usability on other projects. Be wary if the talk turns to costly bells and whistles that may not contribute to the project goals.

6. When addressing learning, how will the vendor handle meeting performance goals?

Tip: Ask vendors whether they have instructional designers available for the project, should their involvement be required. When discussing learning, listen for whether the vendor talks about performance analysis or gap analysis. If they don’t have that skill area, ask how they develop learning objectives and how they design student practice opportunities.

7. How does the vendor keep up with the training/eLearning field?

Tip: Look for the degree to which vendors say they learn by trial and error (we all do some of that!) versus training, education, and participation in professional organizations. What books, authors, and experts do they rely on?

8. How responsive, accessible, and flexible is the vendor? Do you like the team? Would you want to work with them daily?

Tip: Note whether vendors respond promptly to your messages and emails in an informative and courteous manner. In addition, look for indications that they are willing to collaborate and lead whenever appropriate.

9. How do they design a project?

Tip: Ask vendors to describe their project process. For example, does it follow an established instructional design methodology? Does it link up with your business application development process?

10. Check out references—actually make the calls.

Tip: Ask previous clients about responsiveness, project management, and what surprises they had (good and bad). Remember that on large projects there are always difficulties. Find out how the vendor handled recovery and communication about the issues. Did they make things right for their clients on past projects?

We’ve developed a checklist based on these 10 questions. To print the checklist and use it as you compare eLearning vendors, see this eLearning Vendor Comparison Checklist.

Monique Yenamandra

Monique Yenamandra joined Fredrickson Communications in 2004. She has over 15 years of experience in eLearning, LMS and learning management technology, and web development. Monique earned her BA in History from the University of Minnesota.

Integrating Content in Your Learning Management System

by Monique Yenamandra, Technical Architect

I’ve worked with many clients to integrate their custom and off-the-shelf software into their learning management system (LMS). While some of the challenges become easier with experience, some remain relatively time consuming. I see many opportunities for companies to reduce the feelings of uncertainty about the integration process and the time spent on some of the steps. Here are some of the questions I hear most often and some questions that I think people would ask if they knew who to ask.

Q: I have existing eLearning courseware, some of which was developed before we implemented our LMS. Can I connect these courses to my LMS or do I have to have them completely redone?

A: It is likely that you will be able to integrate your existing courses into your LMS without redoing the courses. One solution that we’ve found to be effective is to apply what we call an “LMS wrapper” to each course. This LMS wrapper is a transparent layer of code that handles the communication with your LMS, but it doesn’t change the way the course works for your learners.

If you own the source code for your course or can get permission to make minor modifications, you may very well be able to apply a wrapper to your course without touching the course content.

How does an LMS wrapper work? First of all, we can’t talk about LMSs without talking a little about the AICC and SCORM standards. AICC and SCORM are collections of eLearning technical standards that have been widely adopted by the eLearning industry to define, among other things, communication between LMSs and courseware. Most LMSs now are SCORM or AICC compliant.

Although SCORM and AICC have multiple objectives, the benefit that most companies derive from them is a common method for data structure and data communication. Rather than every LMS or content vendor inventing their data model or communication model anew, everyone gets to benefit from well-thought-out, predefined, and agreed-upon models. That’s the theory anyway. More on that later. Even if your LMS is not AICC or SCORM compliant, your LMS has a defined model that courseware can use to communicate with it.

Now for a brief description of the LMS wrapper itself. To apply an LMS wrapper, you need to have the source code for your course and permission to modify it, and the course needs to be developed in a standard web scripting language or program. Typically, we create an LMS wrapper using HTML and JavaScript that conforms to AICC, SCORM, or your LMS’s communication model. Your existing course is then wrapped in this code. Minor modifications to the course code are made where necessary. The LMS wrapper takes care of communicating with the LMS.
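As a rough illustration of what such a wrapper’s glue code can look like, here is a minimal sketch (written in TypeScript) aimed at a SCORM 1.2-style LMS. Locating an API object on a parent window and calling LMSInitialize, LMSSetValue, LMSCommit, and LMSFinish are standard SCORM 1.2 conventions; the particular data elements reported and the ten-level frame search are assumptions that would vary with the course and the LMS.

  // Minimal sketch of an LMS wrapper for a SCORM 1.2-style LMS.
  // The course content itself is untouched; this layer finds the LMS-provided
  // API object and reports status on the course's behalf.

  interface Scorm12API {
    LMSInitialize(param: string): string;
    LMSSetValue(element: string, value: string): string;
    LMSCommit(param: string): string;
    LMSFinish(param: string): string;
  }

  // SCORM 1.2 convention: the LMS exposes an object named "API" on the
  // launching window or one of its parent frames. Walk up until we find it.
  function findAPI(win: Window): Scorm12API | null {
    let current: Window = win;
    for (let i = 0; i < 10; i++) {
      const candidate = (current as any).API as Scorm12API | undefined;
      if (candidate) {
        return candidate;
      }
      if (current.parent === current) {
        break; // reached the top-level window
      }
      current = current.parent;
    }
    return null;
  }

  const api = findAPI(window);

  if (api) {
    api.LMSInitialize("");

    // Report whatever the LMS has been asked to track; completion status
    // and a raw score are typical examples.
    api.LMSSetValue("cmi.core.lesson_status", "completed");
    api.LMSSetValue("cmi.core.score.raw", "90");
    api.LMSCommit("");

    // Called when the learner exits the course.
    api.LMSFinish("");
  }

A real wrapper also handles error codes, AICC-style communication where SCORM is not available, and the timing of when status is reported, but the basic idea is the same: the wrapper, not the course content, talks to the LMS.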

This explanation is pretty technical, but the bottom line is: don’t throw away or redevelop eLearning courses just because there is no obvious way to hook them up to your LMS. There are options out there, like the LMS wrapper, that can provide a reasonably fast and cost-effective solution.

That’s the technical solution. How does the whole process look? The steps involved are:

  • Identify your technical resource to confirm the feasibility and develop the wrapper.
  • Identify your company’s LMS administrator or contact person.
  • Identify the upload and testing procedure for loading courses into the LMS. (Hopefully the LMS administrator can help you with this.)
  • Determine the communication protocol that the LMS supports and what you would like to use.
  • Determine what information you require the LMS to track (score, time, or completion status are typical), if your LMS supports tracking this information, and if you will be able to generate reports on that data if necessary.

Q: What is the biggest challenge in getting a course hooked up to an LMS?

A: In my experience, the challenges come in two flavors: procedural and technical.

Procedural challenges: First, there is the challenge of undefined or vague LMS procedures. Along with this problem goes the fact that it is often hard to even identify who the LMS administrator is within a company. It also can be the case that the administrator may not know enough about how the LMS works to clearly define the process for your specific need.

Here I need to digress a little into the key role that the LMS administrator plays in helping a company make the most of its LMS. The LMS administrator, naturally, plays a big part in defining the process for connecting content to the LMS.

However, as important as the role is, the skills required of the LMS administrator are not always well understood. In many cases, the LMS administrator needs to be quickly chosen, either to replace an administrator who’s moving on, or because implementation resources are rolling off the project and now the learning organization needs to take ownership of the new system.

The important thing to understand is that, in general, LMSs are not yet to the point where they can be completely administered by someone who doesn’t have a strong technical aptitude. A non-technical LMS administrator may get by for the most part, but there is a price to pay and you may not even know you’re paying it.

Basically a non-technical person can handle most LMS tasks when things are going well. It’s when you want (or need) to do something slightly different, or when there’s a problem, that an LMS administrator with a strong technical aptitude pays off. Not only are they able to understand the need or issue, they are better able to understand limits, quantify problems, and facilitate solutions.

Another aspect of the integration procedure is identifying the process for testing. The steps involved in testing LMS integration may be unclear. For instance, you may not know the right person to contact to facilitate the testing and troubleshooting. Clearly, issues like these will affect your timeline and your costs, and will make ironing out technical blips more time consuming. This brings us to the technical challenges.

Technical challenges: Consider the case of AICC or SCORM-compliant LMSs. The fact is, although LMSs are mostly compliant with these standards, there are usually technical variances that require massaging the course’s code to get it to work with the LMS. This is all just part of the process of getting things to work, but it can be frustrating, especially when you’re in a hurry to launch a course. If you know who to work with, troubleshooting these inevitable hiccups will go faster. The more technical savvy the LMS administrator has, the more quickly the process goes.

Some companies have a resource who can help you answer some of these questions. Even if yours doesn’t, the good news is that once you’ve worked through the integration process, the next time goes much more quickly.

Q: Is my LMS honeymoon over? I’m happy with the global tracking and reporting ability of our LMS, but our users are complaining that the LMS is too hard to use. It takes too many clicks to reach a course and it’s not easy to navigate. Is there anything I can do?

A: It may make you feel better, if only briefly, to know that you are not alone.

My take on this is that the first generations of LMSs failed to take into account the user experience, which, after all, does matter. A lot.

An LMS is great for tracking and centralizing data, but overall, users are not interested in learning how to use a tool that exists in order to facilitate the learning that they really care about. The problems I hear about range from overly complex and non-intuitive interfaces to the LMS interface not updating learner completion information.

To make things worse, little of the user’s LMS interface is customizable out of the box. Some clients have gone so far as to develop end runs around their LMS just to provide simple, minimal interaction with the LMS.

This answer may not be much help, but we have heard that new releases of some popular LMSs will respond to some extent to these complaints. When you get to the next upgrade point, or if your company has decided to purchase a new LMS, it is well worth your time to consider the user experience in your decision making.

Jay Kasdan

Jay Kasdan joined Fredrickson Communications in 2004. He has over 15 years of experience designing, developing, and implementing training programs. He also specializes in measuring performance solutions and training effectiveness, and he has developed training assessments for all four levels of Kirkpatrick’s training evaluation model.

While Jay was a training manager for Deluxe Corporation’s American SAP implementation, his group won the prestigious National Impact Award for SAP implementations from the Americas’ SAP Users’ Group (ASUG).

Jay holds an MS in Vocational Education from the University of North Dakota.

Measure the Right Objectives: A Formula for Designing and Evaluating Training Programs at Level 2

by Jay Kasdan, Project/Account Manager

I’m often asked by our clients and subject matter experts, “How do we know if our training is effective?” That’s not an easy question to answer. Measuring training effectiveness has always been and continues to be a challenge.

When we think about measuring training effectiveness, it helps to consider the following questions:

  • Do your training programs meet their objectives?
  • If they do, how do you know the objectives were met?
  • How do you measure concepts or competencies in training?
  • Do your objectives tie to key competencies, measurements, and exercises?

Wouldn’t it be great if you could ensure that you are designing training programs that identify the correct objectives and meet those objectives? You can! It is as easy as determining four components:

  • Competencies
  • Objectives
  • Exercises
  • Evaluation

In this article, I’ll provide you with some basic information about how I use this formula to develop training programs, and how the formula can help provide guidelines for subject matter experts. It’s important to remember, though, that meeting your objectives only matters when they are the correct objectives.

Just to make sure we’re on the same page when it comes to evaluation terminology, I’ll be referring to Kirkpatrick’s Four Levels of Measurement. This model was developed in 1959 by Donald Kirkpatrick, PhD, and is the most widely recognized model for training evaluation. Kirkpatrick’s four levels are:

  • Level 1 – Reaction
  • Level 2 – Learning
  • Level 3 – Behavior
  • Level 4 – Business Results

The current state of training evaluation

If you had trouble answering the questions I posed at the beginning of this article, you’re not alone. Unfortunately, many training programs and training departments fall short of identifying and meeting their objectives. In fact, in many cases, training programs don’t have formal objectives at all. And the training programs that do have objectives often do not determine if they’ve been reached (Level 2 evaluation). Add to this the fact that many organizations are unable to identify the exact results achieved through training, and it’s pretty easy to see why we have confusion when it comes to evaluating training effectiveness.

Chances are also good that the training programs that are measuring at Level 2 aren’t measuring the appropriate learning. According to the ASTD’s 2005 State of the Industry Report, only 54% of the companies surveyed measured training results at Level 2. So even before we consider the quality of the measurement, only about half of the companies in the survey are measuring at Level 2 at all.

A simple formula: competencies, objectives, exercises, and evaluation

I advocate using a simple formula to help measure the knowledge gained through training (Level 2). This formula can help training organizations dramatically increase their training results and measure them.

The formula matches competencies, objectives, exercises, and evaluation in an integrated fashion to measure Level 2 training results of the appropriate competencies. Additional information about these four components follows:

  • Competencies – what learners need to learn for their job.
  • Objectives – specific measurable and observable statements of what will be learned in the session.
  • Exercises – activities in the training session that will be used for evaluation purposes.
  • Evaluation – tool used to measure the competencies.

Competencies

The word “competencies” has a variety of definitions. For the purpose of this model, I define competencies as the tasks or skills that the learner must complete on the job. Competencies should be identified as a first step in developing your training materials.

Obviously, if we can’t identify the competencies or tasks the individual needs to complete on the job, we cannot develop effective training. The identified competencies will be matched to objectives, exercises, and the evaluation.

For example, if we were to write simple statements regarding technical and soft-skill competencies, they might look like this:

  • Technical competency – The inventory control specialist will be able to use SAP to complete the identified system transactions associated with their job.
  • Soft-skill competency – The customer service specialist will be able to answer a customer complaint by following all of the elements of an effective customer service call.

Objectives

I’ve found that not all instructional designers match the training program’s objective to the competencies. By tying objectives to the competencies, the instructional designer:

  • Describes how a learner will demonstrate their knowledge, comprehension, and ability to perform a specific task.
  • Communicates an intended instructional result to learners by conveying a picture of what a successful learner will be able to do.

For anyone not familiar with objectives, Robert Mager’s book, Preparing Objectives for Programmed Instruction (1962), remains the standard for writing objectives today. By Mager’s definition, an objective should have three components.

  • Behavior – should be specific and observable
  • Condition – sets the circumstance of the behavior
  • Standard – level of performance that is considered acceptable

Here are some examples of technical and soft-skill objectives:

Technical objective – The participant will be able to use the On-Line Quick Reference (OLQR) to post a goods issue using movement type 201 and cost center 4010591 within 10 minutes.

Soft-skill objectives – The participant will:

  • Get the customer’s name during the first 45 seconds of the phone call.
  • Use the customer’s name when confirming/restating the customer complaint.
  • Confirm/restate the customer’s issue within two minutes.

The role of exercises

Many training programs provide a post-test on the knowledge gained by the learner. While this is an effective practice for measuring knowledge, it has little benefit when the competency is actual performance rather than just knowledge.

This is why exercises in training courses are so important. Exercises give the learner the opportunity to practice and demonstrate competence at the level of the objective. By developing appropriate exercises, the instructional designer provides the link to the objectives and competencies.

When the task or competency must be performed (as most competencies must), the exercise should be in the form of a work-related scenario. Again, using our technical and soft-skill training categories, here are a few examples of different exercises:

  • Technical exercise: You are filling an order for a bank and need to issue two covers (00C 502Z01) for the order. Perform a goods issue for the covers using posting date 1/5/2005, movement type 201, and cost center 4010591. You have 10 minutes to complete the scenario.
  • Soft-skills exercise: You receive a call from a customer complaining about an item on their credit card bill. Complete the first three steps of an effective customer service call within two minutes.

Evaluation

The final part of the formula is the evaluation. The evaluation ties together the competencies, objectives, and exercises with measurement of the performance. The evaluation tool measures the competencies that are defined by the objectives and completed during the exercise. Examples of technical and soft-skill evaluation tools are shown below.

Technical: example evaluation

Inventory movement: Did the participant…
  • Post a goods issue within 10 minutes? Yes / No
  • Use the OLQR to post a goods issue? Yes / No

Soft skill: example evaluation

Customer service call: Did the participant…
  • Get the customer’s name during the first 45 seconds of the phone call? Yes / No
  • Use the customer’s name when confirming/restating the customer complaint? Yes / No
  • Confirm/restate the issue accurately within two minutes? Yes / No
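One way to picture how the four components hang together is as a simple data structure in which each competency carries its objectives, its exercise, and its evaluation checklist. The sketch below (in TypeScript) is only an illustration of that linkage, populated with the goods-issue example from this article; it is not part of the model itself.

  // Illustrative data structure tying together the four components of the
  // formula, populated with the technical (goods issue) example from this article.

  interface TrainingModule {
    competency: string;        // what the learner must do on the job
    objectives: string[];      // measurable, observable statements
    exercise: string;          // work-related scenario used for evaluation
    evaluationItems: string[]; // yes/no checklist that measures the objectives
  }

  const inventoryModule: TrainingModule = {
    competency:
      "Use SAP to complete the identified system transactions associated with the job.",
    objectives: [
      "Use the On-Line Quick Reference (OLQR) to post a goods issue using " +
        "movement type 201 and cost center 4010591 within 10 minutes.",
    ],
    exercise:
      "Perform a goods issue for two covers using posting date 1/5/2005, " +
      "movement type 201, and cost center 4010591, within 10 minutes.",
    evaluationItems: [
      "Post a goods issue within 10 minutes?",
      "Use the OLQR to post a goods issue?",
    ],
  };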

Module development checklist

I use a module development checklist for a self-check and as a tool when I’m coaching subject matter experts or non-training professionals. The module development checklist follows the formula described in this article. The link below is a PDF version of a checklist that I think is especially helpful:

Download a training module development checklist based on this model.

By following a formula that integrates the evaluation with competencies, objectives, and exercises, both trainers and training departments can dramatically improve the success of their training. If you’re like me, you are probably saying, “But how do I know the knowledge and skills transfer to the workplace?” Good question — we’ll discuss behavior on the job (Level 3) in future articles.

John Wooden

John Wooden has worked on a diverse range of web projects for Fortune 500 companies and local, county, and state governments in his role as Fredrickson’s director of user experience services. He has led website redesign and information architecture efforts, and conducted hundreds of usability tests and heuristic evaluations on both websites and applications. Behind the scenes, John has developed usability guidelines and interface design standards for applications and websites.

John has taught classes in usability and user-centered design at the University of Minnesota and has presented dozens of seminars on usability and web-related topics.

John has a PhD in English and is a Certified Usability Analyst and member of the Usability Professionals’ Association. He has been with Fredrickson Communications since 2000.

Tech Troubles and Usability Testing

by John Wooden, UX Director

World Usability Day is a good opportunity to offer a friendly reminder of the value of usability testing.

A recent article in USA Today* cited a Harris Interactive study in which about 85% of those polled said they had become so frustrated with the customer support for a technology product that “they ended up swearing, shouting, experiencing chest pains, crying, or smashing things. Slightly more than half said not being able to get a live person on the phone was their greatest frustration.… Seven out of 10 people polled said representatives weren’t trained adequately.”

The easier a product is to use, the less need there is for extensive (and often costly) training, documentation, and customer support—and the more satisfied customers will be. As the USA Today article notes, “The benefits of simple, elegant products extend beyond goodwill from customers. Companies that excel in usability can improve their return on such investment more than 10,000 times, estimates Randolph Bias, a professor at the University of Texas at Austin’s School of Information and co-author of Cost-Justifying Usability.”

The most effective way to learn if a product is easy to use is to conduct usability testing with representative end users. Observing people attempting common tasks with a product while they “think out loud” can be tremendously illuminating. With testing, it’s possible to identify the specific aspects of a product that cause confusion or frustration, and then determine how to correct those problems.

But don’t wait until just before a product is ready for release to test—correcting a problem at that point (or worse, after it’s already been released) can cost 100 times as much as correcting that same problem in the design phase.

Usability evaluation is essential for websites and applications, but the same methodology can be applied to any product or service, including customer support.

Who wouldn’t agree that the technology around us needs to be more usable? We could all do with a little less swearing, shouting, crying, and smashing things—not to mention chest pain.

*Jon Swartz, “Technology troubles set off tantrums, tears and tirades,” USA Today (11/6/2006)

Jill Stanton

Jill Stanton joined Fredrickson Communications in 1999. She has been in the documentation and training field for 12 years. While at Fredrickson, Jill has served as an information developer, instructional designer, trainer, and project manager.

Jill earned her first BA in English, Theatre, and Secondary Education from Hamline University. She then obtained a BA in Psychology from Augsburg College.

Learn About New Technology, but Don’t Forget the Learner

by Jill Stanton, eLearning Lead

In April, I attended the eLearning Guild’s Annual Gathering in Boston. The conference attracted many of the industry’s top researchers and practitioners, and it provided an opportunity to learn from both authorities and peers. It was exciting to hear about the latest trends and buzz surrounding eLearning, but the main message I heard again and again throughout the conference was, “Don’t forget about the learner.”

There were many opportunities to learn about emerging technologies, including eLearning, podcasting, synchronous eLearning, task-based assessments, blogs, game-based learning, discussion boards, video-based interactive role playing, and others.

Drawn in by session titles like “What’s New in eLearning,” I found much more than simple descriptions of new technologies. Again and again, the leaders in the industry were reminding us to start with sound learner-focused design principles rather than to simply use the latest technology.

My research at the conference focused on virtual classrooms (also known as webinars or webcasts), and I had the opportunity to participate in an all-day workshop on the subject led by Dr. Ruth Clark, president and founder of Clark Training & Consulting.

Dr. Clark provided excellent information about how to apply the results of research on media and learning to design for online classrooms. One of her recommendations was to provide multiple ways for participants to interact, including polls, whiteboards, breakout sessions, and inductive questions. Offering many ways to interact helps to counteract the lack of social presence in a virtual classroom. (Social presence refers to the facial expressions, feedback, and sense of attention in face-to-face communications.) This and other sessions on the topic of virtual classrooms delivered a few key messages:

  • Consider the media you’re using. A virtual classroom should not simply deliver a traditional classroom curriculum online. You need to address the lack of social presence and provide other means to engage learners.
  • Don’t just give presentations! A lecture format isn’t terribly effective in traditional classrooms, but it’s even worse in an online setting that lacks social presence.
  • The most effective virtual classroom sessions are polished and professional. These sessions are scripted, practiced, and produced to capture and maintain the learners’ attention. Most sessions rely heavily on audio, so the audio must be well-prepared to be engaging.

Whether you’re considering training in the virtual classroom or via other emerging technologies, one of the most important steps you can take to ensure the success of training is to find out how to use these technologies to effectively engage and teach learners. Make sure your training always focuses on the learner.


Why Your Users Don’t Like FAQs

by John Wooden, UX Director

Frequently Asked Questions (FAQs) are everywhere on the web, used on all kinds of sites and in all kinds of contexts. From their humble beginnings in the early days of Internet newsgroups, FAQs have become a standard way of providing end users with important information.

But there’s a problem: users don’t like FAQs, at least not the way they are presented on many web sites. In usability test sessions I have conducted over the years, test participants have repeatedly made the same two complaints about FAQs:

  • “They take too long to scroll through.”
  • “They never have the question I’m looking for.”

And sometimes, testers make a third complaint:

  • “The answers are too long-winded.”

In his article Are You Ready for the New Ruthless User?, J. Hruby described the “new ruthless user” as impatient, laser-focused on their task, and looking for the quickest path to the target information. If you are thinking about adding FAQs to your site, it’s worth remembering this behavior.

Too many FAQs

Just as most users are unlikely to scan more than the first five or six items in a list of search results, they are just as unlikely to scan many more than five or six FAQs. When there are dozens or even hundreds of FAQs, the description “frequently asked” simply isn’t appropriate anymore.

For example, the US Government site for people with Medicare lists 390 FAQs (20 pages of questions), and although these have been categorized, there are 19 different categories to review, some of which list 15 or more questions. The user still has to do a lot of work. (Yes, there’s an FAQ search function, but with scoped searches like this it’s easy for users to lose track of the category they are searching.)

Propaganda FAQs

Another common problem with FAQs is when the questions posted are not ones any real end users ever asked. Instead, the questions are presented simply to provide an opportunity for marketing happy talk. For example, real end users don’t ask questions like, “How does ABC Inc manage to deliver on time and under budget so consistently?” Or, “How can I use XYZ software to increase our profits and efficiency?”

These are propaganda FAQs, and users will actively avoid them. In contrast, end user FAQs are usually much more specific and focused on solving a particular problem.

Poorly written FAQs

Wordiness and other forms of sloppy writing afflict many FAQs, where either the question or the answer, or both, is unclear. The following examples are from the site of a Twin Cities business:

Q. Security level?
A. Added security

Q. Cost during Winter?
A. Save on heating and cooling cost

The text may be mercifully short, but the questions aren’t questions and the answers aren’t answers. It isn’t even clear what the subject is. (This site’s design, unconventional scroll bar, and distracting Flash animation don’t help much either.)

FAQ guidelines

So what are the guidelines for presenting better FAQs?

  1. If you must present FAQs, keep the list under 10 questions, preferably no more than five or six. If there are many more questions that are truly frequently asked, divide the questions into categories. Obtain user input into the question categories through a card sort exercise.
  2. Present questions that users really ask, not questions you want users to ask. FAQs should derive from carefully documented customer interactions, such as phone calls, e-mails, and the like. Be sure to echo end user keywords in the questions. Users scan for keywords that match their task — your FAQs need to include those keywords.
  3. Keep the questions and answers concise, but not so concise that they lose all context and clarity. The task of writing FAQs should be given to a technical writer, who will know to use bulleted and numbered lists, active sentences, short paragraphs, and proper grammar.
  4. Allow users to rate answers.
  5. Provide an alternative to users who do not see their question and answer. Make sure contact information is prominent.

Topic indexes

You can avoid FAQs and still help users find the answers to their questions with good topic-based navigation. Topic indexes work well, because users can simply scan for the keyword that is aligned with their task, and then navigate to the relevant topic page.

Search/answer

More site owners are recognizing that users consider search engines to be answer engines. E*Trade, for example, invites users to “Enter Questions or Keywords” in the Search field. Even if E*Trade is simply offering a good keyword search, they are using their search engine as an effective tool for dialogue with site visitors, allowing them to begin by asking a question, and avoiding a lengthy list of FAQs. (Yahoo Answers takes this a step further.)

Final reminder

Be sure to get user feedback on your FAQs, your navigation, and your search through usability testing. This is the best way to learn what your users really want from your site.

Warm, Warmer, Hot!

by John Wooden, UX Director

Review of Hot Text: Web Writing that Works, by Jonathan Price and Lisa Price. Indianapolis: New Riders, 2002. [ISBN 0-7357-1151-8. 507 pages. $40.00 (softcover)].

Jonathan and Lisa Price’s Hot Text is a fun, engaging, thoroughly informative guide to writing for the web, making it an ideal textbook for teachers, students, and professionals. Like Nick Usborne’s Net Words, Hot Text explains and illustrates how and why writing for the web is different from writing off-line. The Prices’ approach to the subject is broader than Usborne’s, however. Whereas Net Words focuses on writing for e-commerce sites, Hot Text discusses writing for the web in general, though with specific guidance for those practicing different types of writing (such as developing online help).

The book is divided into five sections (which are listed along the top of every page, like a web site breadcrumb):

  1. Net Spirit: Of all the sections, this one is the most amorphous, but perhaps this is to be expected of such a large, amorphous subject. It ranges widely, from the importance of getting to know your audience, to writing honestly and with an attitude, to thinking in terms of XML and “objects” (rather than “documents”), to understanding everything the computer screen does to your words (they look fuzzy, they are unstable, they are three dimensional).
  2. Human Style: For many readers, this section will be the most valuable (it is also the longest). It describes and illustrates what works and what does not in online writing. After each guideline, the authors describe the “audience fit” — in other words, how pertinent the guideline is if site visitors want to “have fun,” “learn,” “act,” “be aware,” or “get close to people” (such as in a discussion group).
  3. Genres: Genres are important because they are all about conventions and audience expectations. In literature, some writers can experiment with genres to achieve different effects, but in a business or learning environment, it is best not to fool around with audience expectations. The Prices describe and illustrate best practices in 10 online genres: help and FAQs, privacy policies, email responses to customers, marketing copy, news releases, news articles, email newsletters, weblogs, webzines, and resumes.
  4. Become a Pro: A relatively short section, this provides advice on working professionally as a web writer and editor, both on staff and freelance.
  5. Backup: An extensive list of related web sites, books, and articles, together with an index.

Despite its textbook length, Hot Text makes fast, enjoyable reading. And the Prices invite you to submit comments and questions online, providing a URL at the end of every chapter.

Lola Fredrickson

Lola Fredrickson is the founder and CEO of Fredrickson Communications. She has more than 30 years of experience in all aspects of technical communication. She has worked in both the public and private sectors to produce all types of technical communication products, books, and training materials for organizations of all sizes.

Lola has published and presented numerous articles and papers in the area of technical communication. In 1993, she won the International Award of Distinguished Technical Communication from the Society for Technical Communication for the publication of a quality model for the profession.

Lola holds a BS in art and design, and an MS in plant sciences and chemistry. She is an Associate Fellow of the Society for Technical Communication, serves on the advisory council for the technical communications program at Metropolitan State University, and has taught technical writing courses at the Management Center of the University of St. Thomas. Lola is also active in numerous community organizations.

Six Essential Skills for Delivering Information on the Web

by Lola Fredrickson, CEO, Fredrickson Communications

Knowing the essential skill sets for designing, developing, and implementing websites (including those on intranets and extranets) is a key to successfully resourcing and managing website development projects.

This may not sound like news. But if my experience teaching and consulting is any indication, the responsibility for creating company websites (including intranet and extranet sites) is often placed on just one or two people who can cover just two or three of the essential skills. The result is usually a site that is less effective than it could be.

Good sites are the work of good teams bringing all of the essential skills together to meet well defined objectives. So, what are these skills? Here’s my list:

Technical design: Designing the site architecture for scalability, ease of maintenance, and, increasingly, for transactions, personalization, and integration with one or more databases.

Visual design: Creating an attractive, compelling interface that is also easy to navigate.

Content design: Organizing, structuring, and writing information specifically for the web, and for the audience using the site.

Usability evaluation: Determining how useful and easy to use a website is through heuristic evaluation and user testing.

Instructional design (for eLearning): Organizing and presenting educational content on the web.

Project leadership: Shepherding a project from beginning to end, keeping it on budget and on schedule, and delivering a quality product that meets the defined business and user objectives.

Each of these skill areas requires not only a knowledge of theory but also of the best tools to put theory into practice. And the tools available are becoming more varied and sophisticated all of the time. Some examples include:

  • Web authoring tools (like Adobe® Dreamweaver® and Flash®)
  • Graphics tools (like Adobe® Photoshop®)
  • eLearning courseware and learning management systems (like ToolBook®, Saba®, and Plateau®)
  • Markup, scripting, and programming languages (like HTML, CSS, JavaScript, PHP, Active Server Pages, Java, and SQL)
  • Best practices for gathering requirements and user preferences, creating and evaluating proposed designs, developing and testing pages, controlling site access, and optimizing site performance.

Given that you can earn advanced degrees or certification in all of the essential skill areas, it is a rare talent who develops expertise in all of them. This is why it is so important to use a team approach on web projects and to delegate tasks appropriately.

Of course, there does not have to be a one-person, one-skill relationship. Two or three people working together can easily provide all six of the skills. The key is recognizing all of the skills that are needed and understanding that they are complex and take time to develop.

I hope this skills checklist serves as a beginning to help you assemble the best team for a highly visible and valuable job.

Flash and the User Experience

by John Wooden, UX Director

In Jakob Nielsen’s “Top 10 Web Design Mistakes of 2005,” Flash is number 3, which he regards as “a personal failure” after his usability work with Macromedia, the company that originally produced Flash.

But how does a tool end up on a list of web design mistakes? Does this mean AutoCAD is to blame for bad architecture?

Flash doesn’t hurt people—Flash designers hurt people, at least the ones who should know better. Context and user expectations count for a great deal, too. A Flash ad that takes over half your screen and provides a barely visible Close button will almost certainly cause annoyance. In contrast, a Flash animation that users can launch and interact with in an eLearning course can, if done well, be very enjoyable.

Remembering just a few tips will go a long way in Flash design:


  1. Provide users with control. Allowing users to control what they see and what they hear in a Flash movie helps to create a better user experience, and so controls need to be visible and easy to understand. If you’re using icons to represent controls, consider labeling them so that users do not have to guess which control has which effect. Users should have the ability to play, stop, pause, replay, and skip, as well as to turn sound on and off. (If narration is turned off, equivalent text should display.)

    When controls are visible, users will be less likely to try to use the browser’s Back and Forward buttons. Use of these browser controls is less of a problem for Flash movies that play within a site or application than it is for sites or applications created entirely in Flash. In those environments, users’ expectations have been formed by the HTML page model. But a Flash site or application is really only one page, and so a navigation history of visited pages does not apply.

    The problem is that many users do not know this and will still want to use the Back button because that is what they are accustomed to. In response, some Flash developers have devised workarounds; for a discussion of one approach, read Mike Chambers’ article at the Macromedia Developer Center. A simplified, browser-side sketch of the general idea appears after this list.

  2. Make navigation options clearly visible. Don’t make users guess whether clicking this image or that odd shape will take them somewhere. Place navigation options prominently and consistently. If you are creating a presentation, consider dividing it into several short scenes or chapters and then labeling them (similar to what you would see in a DVD). Also, provide a progress indicator to tell users how much is left in each scene. Remember, a 5-minute Flash presentation can seem like an eternity.

  3. Use a preloader to inform your audience about the progress of the presentation-loading process. You can use the preloader animation to set the tone of the piece that follows.

  4. Present legible text. Use a font size of at least 10 points—preferably larger—and ensure good contrast between text and background. Dark text on a dark background just doesn’t work. Sharp, clear text has been absent from many Flash presentations, partly because Flash anti-aliases text by default, creating a somewhat blurry look for smaller fonts in particular. Now Macromedia has introduced FlashType in Flash Player 8, a new text-rendering engine that provides developers with more control over fonts. This improved rendering might tempt some developers to try (or continue) using small fonts because they look better than they used to. Resist.
  5. Do not use Flash as a substitute for good content. Users want substance, and they are very quick to move on if they do not find what they are looking for. You need to be sure your Flash movie is serving a purpose for your users, and not just pleasing a particular stakeholder. Before you begin creating a Flash presentation, be sure you know your audience and your goals, and ask if Flash is the best tool for the job.
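
That general idea can be illustrated outside of Flash as well. The following sketch (plain browser JavaScript, with hypothetical scene names) records each scene change in the URL’s fragment identifier so that the Back button steps through scenes instead of abandoning the page. It is only a simplified illustration of the concept, not Chambers’ actual Flash technique.

  // Simplified sketch: give a one-page (Flash or Ajax) app a usable Back button
  // by recording each "scene" in the URL fragment. Scene names are hypothetical.
  function showScene(name) {
    // In a real app, this would tell the movie or script which scene to display.
    document.title = "Course - " + name;
  }

  function goToScene(name) {
    // Setting location.hash adds a browser history entry without reloading the page.
    // Navigation controls in the presentation would call this function.
    window.location.hash = name;
  }

  // When the user presses Back or Forward, the hash changes; restore that scene.
  window.addEventListener("hashchange", function () {
    var name = window.location.hash.replace("#", "") || "intro";
    showScene(name);
  });

  // Display whichever scene the URL already points to when the page loads.
  showScene(window.location.hash.replace("#", "") || "intro");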

Finally, make sure you test with sample users, and ask them if you’ve been successful with tips 1-5.

A Web 2.0 Primer

by John Wooden, UX Director

As of this writing, “Web 2.0” delivers 273 million results on Google and has already been the subject of two annual conferences in San Francisco. If asked, though, most web users couldn’t define what Web 2.0 is, even if they’ve heard or seen the term before. No surprise—it’s a slippery concept obscured by a certain amount of hype.

The hype, however, doesn’t mean there isn’t something happening that’s worth noticing and trying to understand. Web 2.0 isn’t a thing or a place—it’s an umbrella term to describe rapidly evolving tools and practices that are accelerating various types of decentralization, collaboration, sharing, and social networking. Not all of these tools and practices are completely new—in fact, many of them were used during the Web 1.0 era—but at a certain point in any evolution, a series of incremental changes results in something that is different enough to be noticed and even labeled, and such is the case with Web 2.0.

Interest in Web 2.0 increased following the first conference on the subject in October 2004, organized by O’Reilly Media (best known for its numerous books on topics in information technology). O’Reilly Media VP Dale Dougherty is credited with coming up with the idea to use Web 2.0 as the theme of the conference. Some people have suggested that this concept is or was nothing more than an attempt to market a conference and woo venture capital. Although there is some truth in this, it’s too reductive. So let’s look more closely at what Web 2.0 is (or isn’t) and touch on what some of the implications might be for business.

In “What is Web 2.0? Design Patterns and Business Models for the Next Generation of Software” (arguably the most important article written about Web 2.0 so far), Tim O’Reilly explains that his group’s initial brainstorming sessions about Web 2.0 set out to contrast Web 1.0 sites, practices, and models with those of Web 2.0. Following this lead, we’ll begin by contrasting Wikipedia with Britannica Online.

Wikipedia

Founded by Internet entrepreneur Jimmy Wales, Wikipedia is an excellent example of what Tim O’Reilly calls “harnessing collective intelligence,” a trend that is accelerating with Web 2.0. A blend of “wiki” (an application that allows users to contribute and edit content collaboratively) and “encyclopedia,” Wikipedia is described on its home page as a “free encyclopedia that anyone can edit.” It is “written collaboratively by many of its readers. Lots of people are constantly improving Wikipedia, making thousands of changes an hour, all of which are recorded on article histories and recent changes. Inappropriate changes are usually removed quickly.”

Wikipedia now consists of more than 10,000 user-created entries. The obvious risk of this approach is that it allows content to be published that is uneven, inconsistent, inaccurate, or heavily biased. However, thousands of readers help to monitor the quality of information on the site, an approach that has been called “self-healing.” Some entries are even preceded by a warning about “weasel words” that betray a particular bias. Wikipedia has thus created a model that is very different from the more conventional, centralized, top-down, and Web 1.0 Britannica Online. In contrast to Britannica Online, Wikipedia exemplifies many of the defining characteristics of Web 2.0:

  • Decentralization (dispersion or distribution of functions and powers to end users)
  • User collaboration (contributing, monitoring, and editing content)
  • Sharing (in this case, knowledge-sharing)

BitTorrent

Decentralization, collaboration, and (file) sharing are the essence of peer-to-peer systems. The first mainstream peer-to-peer network was Napster, and its well-publicized legal dispute with the record companies involved the decentralized distribution and sharing of digital music files. With Napster, individual users broke the monopoly on the packaging and distribution of popular music (not unlike the way in which Wikipedia challenges the conventional model of the encyclopedia represented by Britannica).

BitTorrent, another peer-to-peer pioneer, allows users to share music, video, software, and games, but unlike Napster—where one user would have to finish downloading an entire file before other users could request it from her—BitTorrent disperses the file-sharing process. Instead of a thousand users all swarming the same server to get the same file, BitTorrent enables all these users to collaborate in the download by dividing files into smaller bits that are then distributed by peers before being assembled again as a complete file. This has the effect of making the most popular files the fastest to download.
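
As a rough illustration of that piece-by-piece approach (and only an illustration; real BitTorrent adds trackers, piece hashing, and peer negotiation), the following JavaScript sketch splits a file into fixed-size pieces, scatters them across simulated peers, and reassembles the file by taking each piece from whichever peer holds it.

  // Toy illustration of piece-based sharing; not the actual BitTorrent protocol.
  var PIECE_SIZE = 4;                          // characters per piece (tiny, for the demo)
  var file = "The quick brown fox jumps over the lazy dog.";

  // 1. Split the file into numbered pieces.
  var pieces = [];
  for (var i = 0; i < file.length; i += PIECE_SIZE) {
    pieces.push({ index: pieces.length, data: file.slice(i, i + PIECE_SIZE) });
  }

  // 2. Scatter the pieces across three simulated peers (round-robin).
  var peers = [[], [], []];
  pieces.forEach(function (piece) {
    peers[piece.index % peers.length].push(piece);
  });

  // 3. A downloader reassembles the file by collecting each piece from whichever
  //    peer has it, rather than pulling the whole file from one server.
  var assembled = new Array(pieces.length);
  peers.forEach(function (peer) {
    peer.forEach(function (piece) {
      assembled[piece.index] = piece.data;
    });
  });

  console.log(assembled.join("") === file);    // true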

eBay

eBay has become another very well-known example of decentralization and user collaboration in which every consumer becomes a potential seller and distributor. eBay has no main storage facilities, apart from the basements, attics, closets, and garages of all the people who have been or might be sellers, and it has no real products of its own. It simply provides a service—a way for buyers and sellers to do business. It makes available, and helps people find, far more products than any one physical store could ever handle (Amazon, Netflix, and Rhapsody are similar in this regard), and in doing so, serves thousands of niches that together add up to something much bigger than would be the case if eBay dealt only in current mass merchandise.

This is related to what Wired Magazine’s Chris Anderson has called “the power of The Long Tail,” and is another thread connecting several Web 2.0 businesses. The Long Tail describes a feature of statistical distributions in which “a high-frequency or high-amplitude population (the head) is followed by a low-frequency or low-amplitude population which gradually ‘tails off.’ In many cases, the infrequent or low-amplitude events—the long tail—can cumulatively outnumber or outweigh the head, such that in aggregate they comprise the majority” (Wikipedia).

Flickr

Like Wikipedia, BitTorrent, and eBay, other standard-bearer sites of Web 2.0 also have a basis in providing services characterized by decentralization, sharing, and collaboration. Flickr, for example, a widely used photo-sharing site that in 2004 began allowing users to upload personal photos into chats and instant messages, started off with a business model different from that of the Kodak site Ofoto (now called Kodak Gallery).

Whereas Kodak was using Ofoto primarily as a channel for users to submit digital photos for printing (sharing was of secondary importance), Flickr’s model was based primarily on photo sharing. Photo sharing has since become a web phenomenon and is central to the social networking that has become so pervasive in the Web 2.0 world. The best known social networking site now is MySpace, a collection of user photos, profiles, forums, and blogs. As of January 2006, it was the seventh most popular site on the web, with 50 million users.

Flickr, del.icio.us, and folksonomy

In addition to photo sharing, Flickr is known for pioneering the practice of user “tagging” of content. What this means is that instead of shared photos being centrally classified and categorized by a group of information architects paid by Flickr, this content is categorized (tagged with keywords) by users of Flickr. Users can also subscribe to tags. Designers, for example, could subscribe to a tag for “illustrations” to see all activity on that tag, subscribe to the tag of a peer they admire, or both.

The social bookmarking site del.icio.us—where users can maintain and share an online list of their favorite sites, articles, blogs, music, and so on—also enables user tagging, as did Consumating, a now-defunct site dedicated to helping “geeks, nerds, hipsters, and bloggers find dates.”

This decentralized user-categorization of content has been called “folksonomy,” a blending of “folk” and “taxonomy” that means “classification by the people.” Just as Wikipedia and BitTorrent are bottom-up, decentralized forms of sharing, folksonomy is a bottom-up, decentralized form of information architecture that creates an alternative to centrally controlled vocabularies. Debates about the practical value of folksonomies continue, with some perceiving in them the “wisdom of crowds” (to cite The New Yorker columnist James Surowiecki’s book of that name), while others see in them the potential for confusion resulting from inexpert and idiosyncratic classification.
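
To make the tagging idea concrete, here is a minimal sketch in JavaScript (with made-up photo IDs and tag names, not Flickr’s or del.icio.us’s actual data model) of a folksonomy as a simple index from tags to items, plus a lookup that stands in for “subscribing” to a tag.

  // Minimal folksonomy sketch: users attach free-form tags to items,
  // and the index maps each tag to the items that carry it.
  var tagIndex = {};

  function tagItem(itemId, tags) {
    tags.forEach(function (tag) {
      tag = tag.toLowerCase();
      if (!tagIndex[tag]) {
        tagIndex[tag] = [];
      }
      tagIndex[tag].push(itemId);
    });
  }

  function itemsTagged(tag) {
    return tagIndex[tag.toLowerCase()] || [];
  }

  // Hypothetical usage: different users tag photos in their own words.
  tagItem("photo-101", ["illustration", "sketchbook"]);
  tagItem("photo-205", ["Illustration", "watercolor"]);
  tagItem("photo-317", ["minneapolis", "skyline"]);

  console.log(itemsTagged("illustration"));   // ["photo-101", "photo-205"]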

Blogs and RSS

Another decentralized type of sharing and collaboration, blogs are very much part of Web 2.0, and the media term “blogosphere” suggests just how significant blogging has become. Blogs have been around for more than a decade, but only recently have they formed the vast, interrelated networks that allow users to comment on one another’s comments in endless conversations. Tim O’Reilly writes, “If an essential part of Web 2.0 is harnessing collective intelligence, turning the web into a kind of global brain, the blogosphere is the equivalent of constant mental chatter in the forebrain.”

Access to blog content has been greatly increased by the ability to subscribe to “feeds” through RSS (Really Simple Syndication). RSS is based on XML versions of content that can be presented on a range of sites and devices. Podcasts and V-casts have extended this subscription system into a full multimedia distribution system, especially useful for content that might otherwise be marginal.
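
Under the hood, an RSS feed is simply an XML document that lists items with titles, links, and dates, which is why so many different readers, sites, and devices can present the same content. The following sketch (browser JavaScript with a made-up feed) parses a small RSS 2.0 fragment and pulls out each item’s title and link; a real feed reader would first fetch the XML from the feed’s URL.

  // A tiny RSS 2.0 fragment, inlined here for illustration only.
  var feedXml =
    '<rss version="2.0"><channel><title>Example Blog</title>' +
    '<item><title>First post</title><link>http://example.com/1</link></item>' +
    '<item><title>Second post</title><link>http://example.com/2</link></item>' +
    '</channel></rss>';

  // Parse the XML and list each item's title and link.
  var doc = new DOMParser().parseFromString(feedXml, "text/xml");
  var items = doc.getElementsByTagName("item");
  for (var i = 0; i < items.length; i++) {
    var title = items[i].getElementsByTagName("title")[0].textContent;
    var link = items[i].getElementsByTagName("link")[0].textContent;
    console.log(title + " - " + link);
  }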

Blogs, along with the use of digital video cameras and camera phones, have been an important part of the trend toward “citizen journalism,” yet another decentralized, bottom-up phenomenon.

Mashups and open APIs

The original context of “mashup” is popular music, where it refers to a digital mashing up or mixing together of sometimes very different songs to create something new. DJs (real and aspiring) used the large number of digital music files available on sites like Napster to create their own mashups. By analogy, a mashup now also refers to a “website or web application that seamlessly combines content from more than one source into an integrated experience” (Wikipedia).

Perhaps the best known examples of this type of mashup are Google Maps and Google Earth, where developers have been able to take advantage of the open APIs (Application Programming Interfaces) that Google provides to add layers of information to an original map or satellite view. So, for example, someone can add comments about a village market in Mali or information about a restaurant in Minneapolis. Feed readers (such as FeedDemon), which gather together various user-selected feeds from news sites, blogs, and so on, are another type of mashup.

One of the reasons that companies like Google and Flickr are providing open APIs is to enable third parties to add features that drive traffic to their sites. Developers who aren’t on the company payroll end up adding features much faster than Google or Flickr developers could on their own. But the third-party contributors benefit too, by adding information or features that they want others to see and use.
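
In practice, a mashup is often just a short script that pulls data from one service’s open API and layers it onto another’s display. The following sketch is deliberately generic (the endpoints and field names are hypothetical, not Google’s or Flickr’s actual APIs), but it shows the basic shape: fetch data from two sources, then combine them around a shared key.

  // Generic mashup sketch: combine restaurant listings from one (hypothetical)
  // API with user photos from another, keyed by a shared place ID.
  // Neither endpoint is real; substitute whatever open APIs you actually use.
  function getJson(url) {
    return fetch(url).then(function (response) { return response.json(); });
  }

  Promise.all([
    getJson("https://api.example-places.test/restaurants?city=minneapolis"),
    getJson("https://api.example-photos.test/photos?tag=restaurant")
  ]).then(function (results) {
    var restaurants = results[0];
    var photos = results[1];

    // Attach each restaurant's photos so they can be drawn on a map overlay.
    restaurants.forEach(function (restaurant) {
      restaurant.photos = photos.filter(function (photo) {
        return photo.placeId === restaurant.placeId;
      });
    });

    console.log(restaurants);
  });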

Web apps and Ajax

Another indicator of the shift from Web 1.0 to Web 2.0 is a shift to the web as the platform for more and more applications. Web applications for inventory tracking, time reporting, sales planning, training, project management, and word processing (such as Writely) are all becoming increasingly common. The advantages of web apps over desktop apps are many. To note a few:

  • Ease and speed of maintenance and updates
  • Anytime, anywhere access
  • Ease of distribution
  • Larger audiences
  • Possibility for greater aesthetic appeal in a familiar web interface

However, web applications have often lacked responsiveness and functional depth in comparison with desktop applications. For example, most of us are familiar with the inconvenient pause required during data input while a web app saves data to a central server (during which time we watch the hourglass). But a programming approach labeled “Ajax” by technology designer Jesse James Garrett in early 2005 provides a way to solve this problem. The name Ajax—an initialism for Asynchronous JavaScript and XML—quickly caught on and has focused attention on expanding the capabilities of web-based applications.

Google applications are probably the best known exemplars of Ajax. For example, Google Maps and Google Earth allow users to click and drag a map or satellite image rapidly to the north, south, east, or west with virtually no pause for the page to refresh after each motion. Google Suggest, still in Beta, responds immediately to the first few characters a user types, providing a list of possible related search terms. Gmail, Google’s email application, is much faster than other web-based mail systems, in which messages and lists have to be downloaded again each time a user displays a new web page.

Garrett explains how Ajax works:

An Ajax application eliminates the start-stop-start-stop nature of interaction on the Web by introducing an intermediary—an Ajax engine—between the user and the server. Instead of loading a web page, at the start of the session, the browser loads an Ajax engine—written in JavaScript and usually tucked away in a hidden frame. This engine is responsible for both rendering the interface the user sees and communicating with the server on the user’s behalf. The Ajax engine allows the user’s interaction with the application to happen asynchronously—independent of communication with the server. So the user is never staring at a blank browser window and an hourglass icon, waiting around for the server to do something. Every user action that normally would generate an HTTP request takes the form of a JavaScript call to the Ajax engine instead. Any response to a user action that doesn’t require a trip back to the server—such as simple data validation, editing data in memory, and even some navigation—the engine handles on its own. If the engine needs something from the server in order to respond—if it’s submitting data for processing, loading additional interface code, or retrieving new data—the engine makes those requests asynchronously, usually using XML, without stalling a user’s interaction with the application.
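
A stripped-down version of that pattern looks like the following sketch: the page sends a request in the background through the browser’s XMLHttpRequest object and updates a single element when the response arrives, with no full page reload. The URL and element ID here are hypothetical placeholders; this is only the bare mechanism, not Garrett’s full “Ajax engine.”

  // Bare-bones Ajax: request data asynchronously and update part of the page.
  // "/quotes/latest" and "quote-panel" are hypothetical placeholders.
  function refreshQuotes() {
    var xhr = new XMLHttpRequest();
    xhr.onreadystatechange = function () {
      // readyState 4 means the response is complete; status 200 means success.
      if (xhr.readyState === 4 && xhr.status === 200) {
        document.getElementById("quote-panel").innerHTML = xhr.responseText;
      }
    };
    xhr.open("GET", "/quotes/latest", true);   // true = asynchronous
    xhr.send();
  }

  // The user keeps working with the rest of the page while this runs.
  refreshQuotes();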

Although some people have suggested Web 2.0 is Ajax, or vice versa, it’s probably more accurate to understand Ajax simply as a programming approach that can create more possibilities for web apps. As the web becomes a platform for more applications, this in turn creates a challenge to the desktop model, especially as companies like Google provide open APIs, allowing others to add content and create hybrid applications. As Tim O’Reilly states, “Any Web 2.0 vendor that seeks to lock in its application gains by controlling the platform will, by definition, no longer be playing to the strengths of the platform. This is not to say that there are not opportunities for lock-in and competitive advantage, but we believe they are not to be found via control over software APIs and protocols. There is a new game afoot. The companies that succeed in the Web 2.0 era will be those that understand the rules of that game, rather than trying to go back to the rules of the PC software era.”

Open-source tools

Ajax isn’t proprietary technology. Similarly, many of the tools that have been used to run the sites and applications commonly identified as examples of Web 2.0 are open source, including:

  • Apache—a Unix-based HTTP server
  • Linux—a Unix-based operating system
  • MySQL—a relational database management system that uses Structured Query Language
  • Perl—a general-purpose scripting language, originally created for text processing, that is commonly used for programming web applications
  • PHP—a server-side scripting language designed for building dynamic web pages (often used to process complex web forms)
  • Ruby on Rails—a web application framework that allows developers to create new applications quickly, while implementing effective user interfaces.

The open source movement is motivated by the idea that source code should be made freely available to anyone who wants to use it, and it has had the effect of unleashing the collective talent of thousands of programmers to improve and build on existing code. SourceForge.net is a fascinating example of the open source community in action—the world’s largest open-source software development website, hosting more than 100,000 projects and over 1,000,000 registered users.

Web standards, interoperability, and mobility

In addition to open source software and scripting languages, Web 2.0 sites and applications are being developed using web standards: “CSS for layout, XML for data, XHTML for markup, JavaScript and the DOM [Document Object Model] for behavior” (Zeldman). Because these sites and applications have been built with web standards, thereby separating presentation and content, they are accessible to a greater variety of people and devices (such as handhelds). This interoperability and mobility is another notable characteristic of Web 2.0. This is also another form of decentralization, because content is no longer “centered” or based in just one place. As McManus and Porter point out, “Designers have to start thinking about how to brand content as well as sites. It means designers have to get comfortable with Web services and think beyond presentation of place to APIs and syndication. In short, it means designers need to become more like programmers.”
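
The “JavaScript and the DOM for behavior” part of that formula usually means keeping scripts out of the markup itself. In the following sketch (the class name and confirmation message are hypothetical), behavior is attached through the DOM after the page loads, so a browser or handheld that ignores scripting still gets clean, working markup.

  // Unobtrusive behavior layer: no onclick attributes in the markup.
  // Any link with class="confirm-delete" gets a confirmation prompt.
  window.addEventListener("load", function () {
    var links = document.querySelectorAll(".confirm-delete");
    for (var i = 0; i < links.length; i++) {
      links[i].addEventListener("click", function (event) {
        if (!window.confirm("Delete this item?")) {
          event.preventDefault();   // cancel navigation; markup still works without JS
        }
      });
    }
  });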

Where does this road lead?

The short answer is that no one has a map. For now, Web 1.0 sites and applications far outnumber 2.0 sites and apps, and they will for some time. Although Web 1.0 opened up a new channel for organizations to communicate and sell, most sponsors of Web 1.0 sites were not especially concerned with creating a dialogue with users, or opening up possibilities for user sharing and collaboration; they just wanted a “presence” on the web. Most of these sites are digital brochures, though perhaps more informal and interactive than their paper counterparts.

Nothing is inherently wrong with this, especially if an organization does a good job of organizing and presenting the information it has. But as more organizations embrace the principles of Web 2.0, this approach may begin to seem old-fashioned in comparison with sites that facilitate dynamic forms of participation and collaboration that result in new ideas and new creations. More significant perhaps, organizations with Web 1.0 sites may miss out on opportunities to engage clients and prospects and create new business.

Even if some of the vanguard 2.0 sites and apps don’t last, decentralization, sharing, collaboration, and social networking will continue and take on new forms. This isn’t to say that people will no longer look for certain types of authoritative, top-down information, but it does mean that more managers, developers, designers, and writers will have to figure out how to navigate the Web 2.0 world to accomplish their goals, creating their own roadmaps as they go.

For now at least, consider a few questions about your own sites and apps to take a reading on your position.

  • Are you using web standards to write lean code that is accessible to different types of users and devices?
  • Are you taking advantage of open-source software and scripting languages?
  • Are you migrating applications to the web?
  • Are you evaluating whether your web content really matters? Is it content others would value and want to share? Is it easy to find? Are you thinking about how to brand your content? Are you adding, or considering adding, RSS to your sites (internal and external)?
  • Are you implementing, or thinking about, ways to make your site more dynamic by inviting user participation, sharing, and collaboration? Are you taking advantage of the knowledge and talent that your users (internal and external) can provide in a way that serves their needs as well as your own?
  • Are your programmers, designers, writers, and usability analysts working more closely together to create user experiences and content that serve the needs of customers and prospects in different niches and bring them back for repeat visits?

Things are moving fast—here’s looking forward to Web 3.0.

25 Tips for Better Web Writing

by John Wooden, UX Director

Drawing on the recent work of Jonathan and Lisa Price and Nick Usborne, the research of usability experts such as Jakob Nielsen, and our own experience, we have compiled this list of 25 tips for writing online. This is the bread and butter stuff that everyone writing for web sites and e-newsletters should know.

Brevity

  1. Cut any paper-based text by 50%. But don’t cut so much that your words lose all meaning and personality. And don’t cut so much that your words become ambiguous.
  2. Make each paragraph short. If possible, keep paragraphs to two or three lines.
  3. Delete marketing happy talk and hyperbole. Be direct, honest, and sincere.
  4. Keep to the main point. If information is not relevant, delete it. If it is important, but not directly related, move it elsewhere and link to it. Also consider "sidebars."
  5. Write in the active voice.

Scannability

  6. Keep each paragraph to one main idea.
  7. Put your conclusion or lead idea in the first paragraph of the article.
  8. Avoid colons, semi-colons, and apostrophes. (They are hard to spot on screen.)
  9. Use tables, charts, or graphs to present repeating information.
  10. Turn any series into a bulleted or numbered list.
  11. Use titles that are clear enough to identify the contents of the page. Make sure that menu labels match page titles.
  12. Use meaningful subheadings to help visitors scan pages.
  13. Use bold to highlight what is important.

Effective links

  14. Place links at the ends of sentences wherever possible, rather than in the middle of sentences.
  15. Provide clues so visitors know what they will get when they click a link.
  16. Avoid using "click here" and "click to." Don’t point to your links. "Shift the focus from the links to the subject" (Price and Price).
  17. Link to external sites when relevant.
  18. Point to what’s new with special links.

Good manners

  19. Write clear, memorable URLs. Try to keep your URL short and predictable, and avoid special characters.
  20. If you redesign your site, make sure that you set up redirects on your server so people who click bookmarked links can get to the new pages.
  21. Tell visitors how large a media object is before they start downloading it.
  22. To optimize searching, use keywords in your page titles and body copy, and use the meta keyword tag. Add a page description in the meta tag.
  23. Write alternate text for images.
  24. Confirm a visitor’s location by showing the position of the page they are viewing in the overall hierarchy.
  25. Write each menu so it offers a meaningful structure.