Blindspots Revisited

Some of you may recall a previous blog post I did on blindspots ("Understanding Blindspots"). A quick refresher on that concept before I take another crack at the topic: we have areas of ignorance—things we don't know, but where we're usually aware of the weakness or deficiency. For instance, I know nothing about horse riding or dressage—I'm aware that is an area of ignorance for me. Then we have blindspots—areas we not only don't know about but don't know that we don't know about. In other words, blindspots are particularly dangerous because unlike an area of ignorance (which we might tread lightly around or avoid because we know it's a weakness), blindspots typically involve overconfidence. Individuals can have blindspots and organizations can as well—in fact, most examples of military or intelligence failures involve blindspots.

I wanted to revisit this topic because I've been working with two recent clients on their strategy, plans and high-level goals. One client is in the US intelligence community and the other is in the private sector (but plays in the national security space). A key part of both pieces of work has involved identifying the collective blindspots within each organization. While I've done work like this plenty of times before in my career, it's always fascinating to see what emerges as a blindspot within a client organization.

Both clients have bought into the value of identifying what their blindspots are. Only one of the two, though, has really committed to any action to deal with those blindspots (rather than going "yep—that's spot on!" and then ignoring them). At least by surfacing a blindspot and talking about it, we have a chance of mitigating it a little—just turning it into an area of ignorance—perhaps!

How do you spot blindspots? There are a number of techniques.

  1. Look at what doesn't get talked about in the organization, or what isn't funded. While that's not a foolproof way of identifying a blindspot (sometimes something doesn't get talked about because it isn't important!), it's a good starting point.
  2. Look at what blindspots the organization had in the past and then test to see whether those conditions have changed.
  3. Identify the critical assumptions the organization or its leadership is making. Assumptions aren't bad—we have to make them all the time. But most of us make assumptions without being aware we're doing so. They're either unconscious or we consider them to be "facts."
  4. Identify the mental models that the leaders and the organization share (mental models and how to assess them are a topic for another blog post!).
  5. Consider the degree of confidence on particular issues. Issues an organization has had success with in the past and is confident it "has nailed" often foreshadow cockiness and a failure to look for disconfirming information.
  6. Examine the organizational culture. A strong, cohesive, dominant culture (and usually there isn't one—usually it's a series of subcultures) can be a clue about blindspots.

Gulf Oil and Performance Lessons

With some labeling the BP oil spill in the Gulf of Mexico as the worst environmental disaster the USA has ever experienced, it's worth looking at what we know so far about efforts to deal with the spill for performance improvement lessons. As I look at what I've heard about this disaster, several critical lessons come to mind.

  1. Ignore process at your own peril.  There has been such an emphasis on "action" and "leadership" (by both private and public sector organizations) that we've seen lots of money, people and activity—but often at cross-purposes.  Throwing money and resources at a problem is usually ineffective when there is no clear alignment around the process connecting all of the specific tasks.
  2. It’s a lot easier to prevent a problem than to fix a mistake.  The Gulf Oil spill illustrates this point so well–far better and easier to prevent the rig blowout than to clean up tar balls from beaches and try to bathe birds.
  3. Being clear about the desired outcome is critical.  Those of you knowledgeable about performance improvement know how critical outcomes are as a means of providing direction.  Unfortunately, everyone assumed there was a clear purpose (clean up the spill) when actually there was tremendous disagreement about direction.  Some groups argued for booms to corral the oil (which doesn't address oil beneath the surface).  Others argued for heavy use of chemicals to eat the oil or break it down (which was opposed by those who felt this could produce worse environmental impacts than the oil itself).  The disagreements were more than just differences over tactics; they reflected major (and often incompatible) directions.
  4. Data matters.  Throughout the first month of the disaster, there was a consistent inability to answer some of the most basic questions: approximately how much oil is escaping daily, what backup or contingency plans are reasonable if the first cap fails, what are the environmental impacts of the oil dispersants being used, and what percentage of the oil remains beneath the surface?  Without some kind of data, policy decisions were being made on the basis of educated guesses and anecdotes.

What other performance insights have you gotten from this mess?

Understanding Blindspots

I've been doing work on strategy and strategic planning with a number of different clients lately and it's gotten me thinking about the issue of blindspots. There are things that we know to be true (or we strongly suspect them to be so). I don't mean dogma or blind faith, but rather through data, research, experience, customer feedback and performance measurement—there are some things about which we can confidently say "this is something that we know to be true or accurate."

Then we have areas that we know we don't know. For instance, I know that I'm pretty uninformed about the tax code. Because of my awareness of my ignorance, I can make smarter decisions about taxes—by hiring an accountant, or by being especially careful when I fill out my taxes each year.

The reality is that no person or organization can know everything. So ignorance about particular topics or situations is a reality of being in the world.

But a blindspot occurs when a person or organization is ignorant about a situation and doesn't realize the ignorance exists. It may be due to dogma. It may be because the situation has changed—what used to be true no longer is, but people haven't recognized that. It may be due to a lack of depth—someone doesn't realize the degree of complexity of a particular issue. In short, a blindspot is a case where we don't know that we don't know something.

Blindspots are particularly damaging to organizations. That's because most big surprises to organizations (especially environmental or market ones) tend to occur because of a collective blindspot: the organization and its executives simply failed to perceive the potential for surprise on that specific issue.

Are You Measuring the Right Thing?

You've probably all heard the phrase "what gets measured gets done," and certainly organizations are paying increasing lip service to the concept of measuring performance more. This post is not an argument against measuring. It's a lesson about the importance of measuring the right things.

A number of years ago, I was called in to help a call center improve their performance. This call center was a 1-800 “help” provider—you called them when a particular appliance stopped working and you needed immediate help or troubleshooting (from simple steps to fix the problem to where to take it to get repaired to what your warranty did and did not cover). Thus, when customers called this center, it was almost always because something was broken—and often with catastrophic consequences.

The call center management team specifically asked me to find ways to reduce the amount of "hold time" that individuals had to wait before getting an associate to help them on the line, and also to reduce the average length of the calls (the theory being that shorter calls would also mean less wait time). And, as a "p.s.," the management team asked me to also take a look at a call center associate named Martha. Martha, they said, was a really sweet person, but if she didn't turn things around, they would have to fire her. Specifically, they said she was too informal with callers (oftentimes not referring to them as "Mister" or "Ms."). And her average call length was longer than that of the majority of other associates in the call center. Now it's worth noting that the vast majority of call centers do measure things like average wait time, call length and whether or not associates follow the script—that's pretty standard for the industry.

Improving Performance Doesn’t Mean You Do Performance Improvement

Okay, I've got a pet peeve—something that really pushes my buttons.  Data from a host of sources has consistently shown that organizations and executives are placing more emphasis on "performance."  Leave aside the reality that many of them (organizations and execs) don't really know what performance means in this context (below the organizational level of profits or sales or end results).  But almost everyone in the HR field therefore knows there is more emphasis on "performance."

So part of what we see is people (internal staff as well as external consultants) tacking the word "performance" onto what they do.  We see "performance-based training" or "performance-enhancing facilitation" or "performance-driven HR" or some other variation.  To me, this reveals a fundamental misunderstanding of the performance improvement field.

Performance – And Performance Appraisals

Intellectually, everyone gets the value of performance appraisals.  Yet every client I've ever encountered usually bemoans the process, and most employees criticize the appraisals.  Why does something that should have so much value end up being so belittled?

Organizations do lots of things wrong when it comes to reviews.  There is a tendency to spring the final evaluations on employees as a surprise.  I have lost count of the number of people who told me that they came out of their appraisal session in shock, having heard things they didn't expect.  One basic rule of the formal appraisal is that nothing in that session should come as a surprise to the employee—it's just a formal meeting to review and sign off on the informal coaching and counseling that went on earlier during the year.  Another issue is the tendency for managers to put off appraisals until the last possible moment.  There are lots of reasons this happens.  In some cases, it's about avoiding unpleasantness or confrontation.  In others, it's because it's a hassle to do the appraisal paperwork and prepare for it—often because the criteria are so subjective.

Data and Information

Performance consultants believe strongly in evaluation—in checking to see if our work has made a difference. And to evaluate means to collect data.

Lots of people collect data. The problem is that just because you can collect data doesn't mean it's worth collecting. I run into clients all the time who ask me to gather particular information or who are terribly proud of the data they already collect. A classic case involves some of the training data that many learning and development departments collect annually. For instance, it's not uncommon for training shops to gather and then aggregate such information as the total average score (on a 1-5 Likert scale) across all training classes, or the total number of employees who attended training. Think about those two measures for just a second. If the average evaluation score for all classes goes up or down, does that prove anything? First, if people like (or like less) a particular course, that doesn't mean it was an effective training program. Second, if the overall score goes up or down, that's not a meaningful measure of the department's performance. Scores could have gone up because people are taking different classes (ones that are more enjoyable to take). Lots of corporate climate survey questions may seem important on their face but under deeper examination really don't tell us a lot.
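To make that mix-shift problem concrete, here's a minimal sketch in Python (the course names, enrollment counts and scores are all invented for illustration) showing how an aggregate Likert average can climb even though no individual course got any better:

    # Hypothetical per-course average evaluation scores (1-5 Likert scale).
    # The scores are identical in both years; only the enrollment mix changes.
    course_scores = {
        "Compliance 101": 3.0,
        "Leadership Lab": 4.5,  # the more "enjoyable" course
    }

    def overall_average(enrollments):
        """Attendance-weighted average across all classes, the way many
        training shops roll up their end-of-course evaluations."""
        total_attendees = sum(enrollments.values())
        weighted_sum = sum(course_scores[name] * n
                           for name, n in enrollments.items())
        return weighted_sum / total_attendees

    year_1 = {"Compliance 101": 100, "Leadership Lab": 50}
    year_2 = {"Compliance 101": 50, "Leadership Lab": 100}

    print(overall_average(year_1))  # 3.5
    print(overall_average(year_2))  # 4.0

The roll-up score jumps half a point from one year to the next without a single course improving; employees simply shifted toward the more enjoyable class. That's exactly why the aggregate number tells you so little about the department's performance.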

When clients appear to be on one of these data goose-chases, collecting all sorts of data that I'm not sure is that valuable, I typically ask them a simple question: "What will you do with the data?" That question stops a lot of clients in their tracks. They might have reasons why it's useful to gather that data ("it tells us if the training department is doing a better job" or "we want to know if the employees like our office space"), but those don't explain what they'll DO with the information once they get it. Would they realistically award bonuses to staff (or conversely, fire or demote people) because average Likert scores changed? If employees indicated that they didn't like the office space, does that mean they'd buy new furniture?

Asking clients "what will you do with that data?" is a really good reality test for some types of questions. It forces clients to explain what they intend to do with the information once they get it. And what you'll often find is that clients say things like "it will tell us if the training courses are valued by the employees"—in which case maybe we should ask that question instead (or look at other indicators, like how often employees blow off classes they've signed up for). Or if they want to know whether employees like the office space as a clue to whether they'll leave, maybe we should just measure employee retention and then do exit interviews to better assess why employees leave.

The Checklist Manifesto

Anyone who is familiar with my work or my publications knows that job aids are near and dear to my heart.  My third book (Job Aid Basics) is about the subject.  Any serious performance student or consultant knows about the power of job aids—about how they are a cheap and effective way of improving performance.  Well, there is a great new book out by the surgeon Atul Gawande called The Checklist Manifesto.

Checklists are just one example of a job aid.  What is a job aid?  A job aid is a device or tool used to improve memory or confidence on the job and thus overall performance.  A wrench is not a job aid (it's just a tool).  But a checklist (which reminds us of what to do), a recipe with steps (so we don't add the eggs too soon), a trouble-shooting guide on how to figure out why the car doesn't start—these are all job aids.

Gawande writes about a number of examples in this great book, but his primary early examples involve healthcare.  He examines the case of the Johns Hopkins ICU where, using a simple five-bullet checklist, the staff reduced central line infections from 11% to 0%, preventing an estimated 43 infections and 8 deaths and saving 2 million dollars per year.  Gawande and a team then tried the same approach at a number of hospitals around the world, from rural Tanzania to Seattle.  Using a 19-point checklist for surgery, they found that EVERY hospital experienced a significant drop in post-operative complications and deaths.  In the 6 months after the checklist was introduced, complications fell by an average of 36% and deaths fell by an average of 47%.  There was no new technology, no other major change or influx of talent or resources—just the use of the checklist during surgery.

Performance consultants know about job aids.  Joe Harless gets credit for having coined the term.  Job aids are often a faster, cheaper alternative to training.  They’re an underutilized way of improving performance and a useful tool in the performance consultant’s tool box.

Gawande has done us performance consultants a tremendous favor.  He's got a significant following (staff in the Obama White House read his writings, and both this book and his previous one, Better, are about improving performance).  Dr. Gawande has provided very specific, tangible and quantifiable examples of how performance can be radically improved with even simple tools or approaches.  For all the clients out there who want to throw training at the problem or rehire a workforce or change the bonus structure, Gawande's work is a useful tool to help us make the case for a performance-based approach to improvement.

Performance Consulting and the Holidays

You may be one of those people who has spent a chunk of your time this holiday season doing some serious, big-time shopping. Whether it was face-to-face or online, at one point or another, you probably encountered some service failings. Maybe it was the inability of the clerk to answer your questions. Or the online database that kept you from purchasing that gift that would have been just perfect for your dear Aunt Margaret. Or the sales associate who was rude or unwilling to help. And your initial and dominant response in those situations was probably one of aggravation and frustration, with thoughts like "whatever happened to customer service in this country?" or "how do they expect to get any sales with experiences like this?" And I won't even address what it was like trying to find parking at the mall or dealing with rude, pushy shoppers.

Here are a couple of suggestions for you when you encounter that situation…

First, be willing to rise above such "slights" and be bigger than the moment. This can be a very special time, and it's a shame to let someone else push your buttons and keep you from enjoying all of the pleasures around you. Regardless of your religious beliefs, this time of year should be about bigger things than paybacks or complaints or upsets.

Second, put on your performance consultant's hat. Move beyond the initial gut reaction and ask why this behavior or experience happened. Treat this experience as if you were a performance consultant on assignment and were supposed to deal with this specific task. How could you calculate the business impact of the performance issue? What contributes to this "sub-optimization"? How could you define the desired performance so it was phrased as an accomplishment that could be objectively measured and replicated consistently? What environmental factors contribute to this performance gap? What information sources would you want in order to find the answers to these questions—who would you want to observe or interview? And think about your consulting skills too. When your spouse or roommate offers one of these venting stories ("You wouldn't believe the jerk I had to deal with trying to order that gift for my parents!"), ask the questions you'd need to ask a client, questions that would move them from a frustrated insistence on training as a fix to a deeper understanding of what the problem is and how it persists. If you can't get that kind of understanding with someone you love, how do you expect to do it with a client who is far less emotionally connected to you?

Third, go out and do something special for someone you love. Or someone you don't know. Rather than stewing over the bad experience, bring some good into your little piece of the world. Practice random acts of kindness and senseless acts of beauty. Give back. In some small way, look for opportunities to help others or make the world a better place, even if only in a modest fashion. Those things should be part of the holiday season and yet aren't limited to just the holidays. And I'd like to think that being a performance consultant is consistent with all of it.

Peace and happy holidays to you.

–Joe

Elevator Speeches and Old Friends: A Perspective From South Africa

I welcome the chance to contribute to the Willmore Consulting Group blog. Joe, your comments and cautions about the relative merits of an "elevator pitch" and "audio logo" for performance consulting are probably an appropriate way for me to link up with you again, after our interesting discussion in Washington DC a few months ago during the ASTD conference there. In fact, it was probably my old (correction: venerable? valued and versatile!) friends Jim and Dana Robinson who introduced me to you, as well as to the concept of a performance consulting "elevator pitch," back in 1981, when they first came out to South Africa on a combination honeymoon trip and "Partners in Change" analysis of our performance improvement strategies in the Edgars Retail Group, which covered over 300 branches in several Southern African countries.

As the Edgars Group Training & Development Manager, I had met Jim at ASTD in Anaheim in 1980, where we compared our learning, goal setting and basic performance management approaches with what was being achieved in JC Penney, Sears Roebuck, and other US retailers. At that time, we didn't even know what an "elevator" was, as we used "lifts" to get us up and down our buildings… but Dana and Jim soon got us all reviewing the many organisational and leadership factors that were enhancing or inhibiting learning, motivation and results in our complex company and society.

The initial scepticism of divisional HRD and line managers ("what can these Americans teach us, after 3 weeks in Africa?") was replaced by enthusiastic responses to their practical questions and assessment tools, which subsequently became embedded in the corporate culture… and Edgars executives became recognised as thought leaders in assessment centres and innovative merchandising.