There are lots of versions of corporate universities. I actually worked with one client (Amtrak) whose senior leadership decided they needed to have an Amtrak University. The senior training leader–when it was clear she couldn’t fight this decision–simply relabeled the existing training and development resources a “University” and set up a “campus” at an existing training site in Wilmington, Delaware. Nothing else changed (in terms of content offerings, structure, or focus), but hey… they were a University now.
I’m sure not all organizations that adopt a “university model” follow that approach. But I’m skeptical of the value of adopting a Corporate University approach to learning, development, and performance within organizations. And I’m not the only one who holds this belief. The esteemed Ruth Colvin Clark has noted some similar issues around the move to corporate universities.
I think there is a tendency to assume that a “university” has more prestige and functions at a higher level than a training department–and that this is therefore a good thing. We (meaning people in general) often view the term “university” positively and assume the work is more rigorous (no one, after all, brands their organization’s L&D shop a “kindergarten”).
But this focus on branding an organization’s learning and development shop as a “university” seems to me to be misguided. First, it places the emphasis on education rather than performance. The primary reason for learning and development in most organizations should be to improve performance. That’s why most training evaluation measures (looking at reaction to the training, or even whether learning took place) don’t seem very relevant to me. I can enjoy the training or even learn a lot yet fail to get better at my job. When the focus is on learning rather than performance, it’s too easy for learning/training professionals to be unaccountable for results (“don’t blame me that results didn’t improve–the participants enjoyed the class!”).
Additionally, I’m not sure universities provide a great model to guide training and development functions. While there are plenty of great examples of innovative learning approaches within higher education, most educators would say that the majority of universities still operate with very traditional models and approaches to teaching and the organization of knowledge.
Plus, a number of organizations have treated the creation of a corporate university as an exercise in centralizing the learning function (to create a “campus”). The irony of this approach is that one of the better examples of innovation at many institutions of higher learning has been the decentralization of learning–moving out to the field, off the campus, away from a central, visible campus.
I would argue that many organizations that have adopted a university model have done so either to “keep up with the Joneses” (i.e., seeing it as a trend they need to follow) or as a way to enhance the prestige of the training department. I think a far better way to enhance the prestige of L&D is to demonstrate a strong track record of focusing on and effectively improving performance.
Customers like to kvetch a lot, so it’s easy to complain about missing the “good old days.” But a lot of the time the old days weren’t so good: no vaccine for polio, 1 in 2 children dying before the age of 10, maybe a world war going on with millions dying, being born lower class with no chance of going to college, or living in a time when there was no such thing as an iPod. Living in the past wasn’t always better.
But lots of people (me being one) feel that overall service performance is getting worse. That’s not just generational narcissism or curmudgeonly attitudes that come from an aging group of baby boomers. I do a lot of client work around service issues and customer experience, and that’s my take. And Bloomberg and J.D. Power collect data on overall service, and that’s their take too–overall service performance is getting worse. Oh, there are exceptions–firms that continue to raise the bar. But overall, most firms seem to be doing a worse job serving customers and creating distinctive experiences that provide a competitive edge. How is that so, when so many firms pay lip service to service and actually spend a lot of bucks on supposedly improving it?
As the content (and garbage) on the web continues to proliferate, it’s sometimes hard to separate the gold nuggets from the chaff (or the garbage). This is especially true in the performance arena. There is one particular site that has been up a while (“a while” in this case means since 1995). It’s the work of consultant Don Clark. Don has produced a true labor of love that everyone in the workplace learning and performance fields needs to be aware of. With a background in the Army and then Starbucks before he set off on his own, Don decided to create a site not to promote himself but to cover a wide range of ISD, training, OD, performance, management, and programmed learning content. He’s got a variety of self-created templates, forms, and manuals you can download on topics like ISD or task analysis. He provides a list of HRD names and why they matter, books that are important, timelines for particular topics, relevant quotes, and more. But mostly the “more” is about tools and examples and content around how to do what it is that we do–more intelligently and effectively. And the site is clearly designed to share knowledge, not for self-promotion or profit. Frankly, I cannot think of a single person in the workplace learning and performance field who has been so prolific on their website in terms of content.
With some labeling the BP oil spill in the Gulf of Mexico as the worst environmental disaster the USA has ever experienced, it’s worth looking at what we know so far about efforts to deal with the spill for performance improvement lessons. As I look at what I’ve heard about this disaster, several critical lessons come to my mind.
Ignore process at your own peril. There has been such an emphasis on “action” and “leadership” (both by private and public sector organizations) that we’ve seen lots of money, people and activity–but often at cross-purposes. Throwing money and resources at any problem is usually ineffective when there is no clear alignment around the process connecting all of the specific tasks.
It’s a lot easier to prevent a problem than to fix a mistake. The Gulf oil spill illustrates this point well–it is far better and easier to prevent the rig blowout than to clean up tar balls from beaches and try to bathe birds.
When Apple solidified its stance regarding Flash on the iOS platform (or rather, the lack thereof), an up-and-coming web standard was suddenly cast into the spotlight. HTML5 was a new, open, standardized version of HTML (HyperText Markup Language, the basis for all modern web browsing) that had been in development since mid-2004, with the first tentative release in 2007. It incorporated features of HTML 4.01 and XHTML 1.1, the previous mid-life additions to the HTML standard, as well as features of Adobe’s Flash and Microsoft’s Silverlight.
Notable additions were drag-and-drop site interaction (an interaction pattern already common on the web by the mid-2000s) and, more significantly, audio and video playback. Instead of requiring a third-party plugin, such as Flash or Silverlight, or a third-party playback codec, such as QuickTime or Windows Media Player, HTML5 could play properly encoded audio and video straight from the browser. This significantly simplified the prospective future landscape of media on the web. Instead of being dependent on the development pace of Adobe or Microsoft, web developers could build directly on capabilities native to the browser.
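As a minimal sketch of the native playback described above (the file names here are placeholder assumptions, not real assets), HTML5’s media elements let the browser handle audio and video with no plugin at all:

```html
<!DOCTYPE html>
<html>
  <body>
    <!-- Native video playback: no Flash, Silverlight, or external codec plugin.
         The browser renders its own playback controls via the "controls" attribute. -->
    <video src="clip.mp4" controls width="640">
      Fallback text shown by browsers without HTML5 video support.
    </video>

    <!-- Native audio playback works the same way -->
    <audio src="theme.ogg" controls>
      Fallback text shown by browsers without HTML5 audio support.
    </audio>
  </body>
</html>
```

The fallback text inside each element is part of the standard’s design: older browsers that don’t recognize the new tags simply display the inner content instead.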
I’ve been doing work on strategy and strategic planning with a number of different clients lately, and it’s gotten me thinking about the issue of blindspots. There are things that we know to be true (or strongly suspect to be so). I don’t mean dogma or blind faith, but rather knowledge grounded in data, research, experience, customer feedback, and performance measurement–things we can confidently say we know to be true or accurate.
Then we have areas that we know we don’t know. For instance, I know that I’m pretty uninformed about the tax code. Because of my awareness of my ignorance, I can make smarter decisions about taxes—by hiring an accountant. Or being especially careful when I fill out my taxes each year.
The reality is that no person or organization can know everything. So ignorance about particular topics or situations is a reality of being in the world.
But a blindspot occurs when a person or organization is ignorant about a situation and doesn’t realize the ignorance exists. It may be due to dogma. It may be because the situation has changed—what used to be true no longer is but people haven’t recognized that. It may be due to a lack of depth—someone doesn’t realize the degree of complexity to a particular issue. In short, a blindspot is a case where we don’t know that we don’t know something.
Blindspots are particularly damaging to organizations. That’s because most big surprises to organizations (especially environmental or market ones) tend to occur because of a collective blindspot that meant the organization and its executives simply failed to perceive the potential for surprise on that specific issue.
You’ve probably all heard the phrase “what gets measured gets done,” and certainly organizations are paying increasing lip service to the concept of measuring performance. This post is not an argument against measuring. It’s a lesson about the importance of measuring the right things.
A number of years ago, I was called in to help a call center improve their performance. This call center was a 1-800 “help” provider—you called them when a particular appliance stopped working and you needed immediate help or troubleshooting (from simple steps to fix the problem to where to take it to get repaired to what your warranty did and did not cover). Thus, when customers called this center, it was almost always because something was broken—and often with catastrophic consequences.
The call center management team specifically asked me to find ways to reduce the amount of “hold time” that individuals had to wait before getting an associate to help them on the line, and also to reduce the average length of the calls (the theory being that shorter calls would also mean less wait time). And, as a “P.S.,” the management team asked me to also take a look at a call center associate named Martha. Martha, they said, was a really sweet person, but if she didn’t turn things around, they would have to fire her. Specifically, they said she was too informal with callers (oftentimes not referring to them as “Mister” or “Ms.”). And her average call length was longer than that of the majority of other associates in the call center. Now it’s worth noting that the vast majority of call centers do measure things like average wait time, call length, and whether or not associates follow the script–that’s pretty standard for the industry.
Intellectually, everyone gets the value of performance appraisals. Yet every client I’ve ever encountered bemoans the process, and most employees criticize the appraisals. Why does something that should have so much value end up being so belittled?
Organizations do lots of things wrong when it comes to reviews. There is a tendency to spring the final evaluations on employees as a surprise. I have lost count of the number of people who told me that they came out of their appraisal session in shock–having heard things they didn’t expect. One basic rule of the formal appraisal is that nothing in that session should come as a surprise to the employee–it’s just a formal meeting to review and sign off on the informal coaching and counseling that went on earlier during the year. Another issue is the tendency for managers to put off appraisals until the last possible moment. There are lots of reasons this happens. In some cases, it’s about avoiding unpleasantness or confrontation. In others, it’s because it’s a hassle to do the appraisal paperwork and prepare for it–often because the criteria are so subjective.
Anyone who is familiar with my work or my publications knows that job aids are near and dear to my heart. My third book (Job Aid Basics) is about the subject. Any serious performance student or consultant knows about the power of job aids–about how they are a cheap and effective way of improving performance. Well, there is a great new book out by surgeon Atul Gawande called The Checklist Manifesto.
Checklists are just one example of a job aid. What is a job aid? A job aid is a device or tool used to improve memory or confidence on the job and thus overall performance. A wrench is not a job aid (it’s just a tool). But a checklist (which reminds us of what to do), a recipe with steps (so we don’t add the eggs too soon), a trouble-shooting guide on how to figure out why the car doesn’t start—these are all job aids.
Gawande writes about a number of examples in this great book, but his primary early example involves healthcare. He examines the case of the Johns Hopkins ICU where, using a simple five-item checklist, the staff reduced central line infections from 11% to 0%, saving an estimated 43 infections, 8 lives, and 2 million dollars per year. Gawande and a team then took the same approach to a number of hospitals around the world, from rural Tanzania to Seattle. Using a 19-point checklist for surgery, they found that EVERY hospital experienced a significant drop in post-operative complications and deaths. In the 6 months after the checklist was introduced, complications fell by an average of 36% and deaths fell by an average of 47%. This was no new technology, no other major change or influx of talent or resources–just the use of the checklist during surgery.
Performance consultants know about job aids. Joe Harless gets credit for having coined the term. Job aids are often a faster, cheaper alternative to training. They’re an underutilized way of improving performance and a useful tool in the performance consultant’s tool box.
Gawande has done us performance consultants a tremendous favor. He’s got a significant following (staff in the Obama White House read his writings, including both this book and his previous one, Better, on improving performance). Dr. Gawande has provided very specific, tangible, and quantifiable examples of how performance can be radically improved with even just simple tools or approaches. For all the clients out there who want to throw training at the problem, rehire a workforce, or change the bonus structure, Gawande’s work is a useful tool to help us make the case for a performance-based approach to improvement.
When I was starting out as a performance consultant, clients used to ask what that title meant–what is a performance consultant? And I’d stumble into a definition of what human performance improvement is and what distinguishes it from other approaches, only to discover that after about the second sentence my client’s eyes had usually glazed over. Typically we, as performance consultants, do a lousy job of explaining to clients what it is we do and why it works. And the biggest reason this happens repeatedly is that we fail to see (or hear) things from the client’s perspective.
An accurate definition of HPT or HPI may be all well and good, but frankly, most clients don’t care about the academics or the theory. Their focus is more likely to be on: “what can you do for me?” Now, if a client wants to know how my approach differs from that of someone in another field, I’m more than happy to provide a performance consulting model or explain particular aspects of the process. But now, when talking with clients, my explanation usually is about the payoff to the clients–the business result. Most of the time I tell clients (especially executives) that I’m a “business consultant.” Because the process I use (performance consulting) is of secondary interest to my clients–what they want are results.