Wednesday, 26 August 2015

The Smartest Human Resources Officers are 'Knowledge Smart'

According to www.Prospects.ac.uk (the UK's leading student and graduate careers advice service), Human Resources (HR) Officers must have a clear understanding of their employer's business objectives and be able to devise and implement policies which select, develop and retain the right staff needed to meet these objectives.

The smartest HR Officers are therefore subscribing to KnowledgeSmart to assess the CAD and BIM skills of new hires and to guide skills improvement over time.

We use KnowledgeSmart to enable our clients to recruit smarter candidates and to prove their ability; we find that it ensures better outcomes. By replacing guesswork with the KnowledgeSmart skills assessment, our clients know precisely how and where their candidates will add value to their organisation before they hire them.

Mike Johnson, Managing Director, JohnsonBIM, London, UK

In addition, HR Officers use KnowledgeSmart skills assessments to:

- Liaise with team leaders to identify skills gaps and improve staff performance;
- Interview and select job candidates;
- Develop and implement policies for performance management;
- Advise on pay and other remuneration issues, including promotion and benefits;
- Plan, and sometimes deliver, training - including inductions for new staff;
- Analyse training needs in conjunction with departmental managers.

With subscribers spread across six continents, KnowledgeSmart online skills assessments are used to great effect by architects, engineers, construction firms and contractors.


Using KnowledgeSmart Skills Assessments to Inform Annual Appraisals

At HOK – a global design, architecture, engineering and planning firm with a network of 24 offices on 3 continents – the HR team actively supports staff to be the best they can be. This in turn delivers significant value for individuals and the organisation at large. With a focus on continuous improvement, HOK's skills assessment results are reviewed annually as part of the appraisal process, and improvement is rewarded through the company bonus scheme and through improved promotion opportunities.

We have learned lots of back-end lessons using KnowledgeSmart. Most notably, we have realised how and where we can improve productivity and profitability for the whole company.

David Ivey, Building Smart Manager, HOK, Chicago, USA

HR teams, operating at the highest level within HOK, welcome this approach to integrated assessment and support the Building Smart Managers in their work. With similar motivations for improved service and performance, Stantec – a 15,000-person professional services firm – is embracing KnowledgeSmart to proactively make skills development a corporate initiative.

We have made the employee and their supervisor responsible for identifying goals and for measuring progress as part of the annual review process. This empowers and supports teams and enables individuals to reach their full potential, and KnowledgeSmart is an invaluable ingredient.

Susan Clark-Serediuk, HR Lead, Stantec, Toronto, Canada


HR Officers are Increasingly Knowledge Smart when Hiring New Staff

Smart recruitment firms and HR Officers are turning to KnowledgeSmart for insight when hiring. Gone are the days of simply sifting through CVs and arranging interviews; firms now demand more from their recruitment teams.

Hiring new staff without first assessing their BIM and CAD skills is an extraordinarily expensive way to do business, as many firms may not learn how badly new recruits have performed until much damage has been done to the project's data quality. With KnowledgeSmart BIM Skills Assessments we gain the insight we need to make the right decisions.

Lonnie Cumpton, Senior Director of Professional Services, US CAD, California, USA

To avoid such unnecessary expenditure and to reduce risk, California-based US CAD invites all job candidates to complete a KnowledgeSmart BIM skills assessment and makes it clear from the outset that there is no intention to fail anyone. US CAD openly uses the BIM skills assessment results to plan for training needs and to benchmark a candidate's assessed ability against their claimed ability, to better understand the individual.


HR Officers are Bringing KnowledgeSmart Results to the Project Resourcing Table

At RTKL – a worldwide architecture, engineering, planning and creative services organization – HR leadership challenged its BIM management team to source a BIM skills analysis tool to help the firm make smarter, better-informed, data-driven decisions about people's CAD and BIM skills when resourcing project teams. The team reviewed the market and selected KnowledgeSmart.

HR generally understands 'why' people were hired and provides a high-level view of the firm's talent base, but historically it has not fully appreciated some of the finer details about CAD and BIM skills. The nature of project-centric resourcing determines that individuals are moved from one project team to another as the workflow rises and falls across a project's lifecycle, and their BIM skill-levels will critically influence their ability to contribute to their new team. With KnowledgeSmart results and analysis in hand, HR now has perhaps the most comprehensive oversight of all staff and their skills.

Andrew Victory, Associate, RTKL, London, UK


Identifying Skills Gaps for Training Needs Analysis by HR Teams

Australia-based A2K Technologies – one of the largest Autodesk channel partners globally – finds that HR Officers are very often involved in purchase decisions for training. Furthermore, A2K finds that HR Officers are very interested in benchmarking for greater insight into team performance, enabling them to extend their role beyond welfare and Health & Safety.

As the recession fades and growth again is the driver for change, firms are increasingly motivated to retain their best staff. KnowledgeSmart helps HR Officers measure their teams' software skills as part of their culture to support, reward and retain value-adding staff.

Sean Twomey, ANZ Consulting Manager, A2K Technologies, Sydney, Australia

A2K also believes that KnowledgeSmart provides younger staff with targets to aim for, finding that KnowledgeSmart results can contribute to building a framework for ambition and career growth.

If you are an HR professional looking for a better way to measure staff skills and interviewee experience, why not get in touch and request a free trial of the KS tools?

R

Thursday, 6 August 2015

A Well-trained User is a User for Life

In April 2015 a brief published by “Bersin by Deloitte” reported that “…organizations which perform well on business outcomes have a talent strategy.” It went on: “…of those organizations rated least effective at business outcomes, between 70 percent and 80 percent lack a talent strategy.” Naturally, the ‘Learning Strategy’ is an essential ingredient of any successful talent strategy.

 
It should therefore come as no surprise that KnowledgeSmart subscribers may fall into the top 20% of all similar organisations as they are able to identify skills gaps, target training investment, and report measurable productivity gains with a clarity that is otherwise out of reach.


“We couldn’t have achieved such marked levels of increased productivity without a total solution for skills analysis and improvement.”


Peter Taylor, Engineering Systems Manager, Tata Steel Projects, Manchester, UK.


While an organisation’s talent is a strategic priority, skills improvement is a functional requirement for improving team productivity and data quality when collaborating on AEC and BIM projects.


Using KnowledgeSmart to demonstrate BIM-readiness

 
At US CAD – a BIM and manufacturing industry consultant and one of the largest Autodesk Authorized Platinum Reseller Partners in the USA – the transparency afforded by KnowledgeSmart’s Revit skills assessment results is clear.


“As firms are fighting hard to win new projects, many say they are ‘BIM ready’ but few can truly prove they are ‘BIM ready’ without skills assessment results to validate their claims.  Including ‘BIM’ on your resume is no longer a reliable indicator of competence; you have to back up your claims with hard facts.”


Jeff Rachel, Vice President, US CAD, Las Vegas, USA.


The US CAD team leads by example, using KnowledgeSmart skills analysis services to assess their own staff and to support their business case when pitching to supply professional services and training.


Thinking Outside of the Box

 
North American firm Stantec – a multi-discipline AE firm with 15,000 employees – has turned training on its head since adopting KnowledgeSmart.  Gone are the days of training staff to use software without first assessing their abilities; that process was too long-winded, too expensive and, in many instances, quite unnecessary when a user already had strong CAD and BIM software abilities.


“We have named our strategy for learning and development ‘Flipping the Classroom’ because the process for skills improvement and assessment has been reversed.  Today, a user’s BIM skills are assessed using KnowledgeSmart before they register for any training courses.  We then use the results to analyse and recommend specific training modules to help the user fill in their knowledge gaps as identified by their KnowledgeSmart score.”


Jim Marchese, Corporate BIM Education Leader, Stantec, Philadelphia, USA.


As a bonus, Stantec leadership now enjoys better reporting on CAD and BIM productivity and skills, using the KnowledgeSmart skills gap analysis to provide insight and a business case for continued training investment.


Using KnowledgeSmart Scores to Build a Business Case for Training


A2K Technologies is one of the largest Autodesk channel partners globally with offices throughout Australia, New Zealand and China.  A2K is also a KnowledgeSmart advocate:


“Training decisions are easier to make when informed by needs.  The days of filling classrooms with people to sit through a multi-day training course are numbered.  Using skills assessment services from KnowledgeSmart we are now able to tailor our training courses to each attendee’s skill level, saving valuable time and budget for additional activities.”


Sean Twomey, ANZ Consulting Manager, A2K Technologies, Sydney, Australia.


Some firms are using their KnowledgeSmart assessment scores to build a business case for securing training investment.  By first demonstrating a need to train, the training investment is easier to secure.  Better still, post-training assessments are a proof point for skills improvement and demonstrate measurable return on investment (ROI).


Plugging Skills Gaps via Bite-sized Training


As new software is released and adopted, upgrade training is required to maximise the benefit realised.  However, many teams cannot spare large chunks of time away from their desks while a project is active.


“Big-Gulp training isn’t easy to fund or organise: it takes people away from project work for too long in one sitting, and it is disruptive to production workflows.  For Stantec, as a large, geographically distributed enterprise, we find that ‘self-paced’ modular training makes more sense.  By first testing our staff to benchmark their skills we can very efficiently prescribe a learning development path which limits time away from the project and maximises the new skills learned.”


David Spehar, Corporate BIM Lead, Stantec, Cleveland, USA.


By combining one of the many online self-paced learning environments with KnowledgeSmart, subscribers can increase their learning efficiency and reduce their training costs, freeing more time for delivering projects.


Training can be a Team Sport – But Not Always

 
At RTKL – a multi-discipline design practice with global presence – training is regarded as a group activity, often delivered via online technology to enable a very open and flexible forum for discussion.  


“Such group sessions can be very hard to manage effectively as there is invariably a delta between the stronger and weaker users which can leave some feeling overwhelmed while others may feel bored as the training lacks pace.”


Andrew Victory, Associate, RTKL, London, UK.


For RTKL, however, this is no longer an issue.  Using KnowledgeSmart’s online skills assessments, RTKL first assesses each user’s skills before grouping users by ability for training.  And as HR owns the training budget – allocating it as needs demand – KnowledgeSmart helps to make the budget go as far as possible by directing it to the neediest for maximum training ROI.


Playing the Long-Game and Trusting KnowledgeSmart Scores

 
HOK – a global design, architecture, engineering and planning firm with a network of 24 offices on three continents – is now in its 4th year with KnowledgeSmart and has taken its time to build up a depth of knowledge through annual skills assessments.


“The whole team is now able to trust the KnowledgeSmart assessments and results to clearly connect skills improvement to training investment.  We have learned lots of back-end lessons using KnowledgeSmart, most notably, we have realised how and where we can improve productivity and profitability for the whole company.”


David Ivey, Building Smart Manager, HOK, Chicago, USA.


Today, annual skills assessment results indicate areas requiring improvement, and bite-size training courses are arranged through “HOK University” – a global learning initiative set up for all HOK staff to improve skills in all work-related areas, including CAD and BIM software.


Training Content Providers use KnowledgeSmart Scores to Improve their Own Offerings

 
White Frog offers strategic educational planning for firms adopting BIM technology and workflows, working with AEC firms to select, refine and apply standards and protocols for BIM use prior to adoption.  Generally, all training programs embrace AEC standards and, where necessary, include other international standards such as those from the USA, including LEED and AIA.


“We offer the opposite of ‘BAG’ training (Basic, Advanced, Good Luck!).  We start by assessing all existing Revit users with KnowledgeSmart before they embark on a rolling Revit learning program.  We also use KnowledgeSmart to inform our own developments.  By reviewing test results from our customers we can sometimes identify areas of our own training materials and processes that need to be improved.  For instance, if some questions never get a strong score, it may be because there are improvements required to our service delivery and training course creation.  So we use those test results to improve our own offerings.”


Paul Woddy, Technical Director, White Frog, Shrewsbury, UK.


Before adopting KnowledgeSmart, White Frog reviewed other skills testing solutions but found them all wanting in one way or another.  Some offered no randomisation of questions while others were missing robust analysis and insight tools.  For the BIM experts at White Frog, KnowledgeSmart was the only complete solution.


Using KnowledgeSmart Services to Maximise the Value of Software and Training Subscriptions

 
Tata Steel Projects (TSP) – a multi-disciplinary engineering consultancy working on rail, nuclear, security, energy, power and construction – is a mixed Bentley and Autodesk shop with an Open Access agreement with Bentley for software procurement.  TSP tested the skills of every single user in the organisation via KnowledgeSmart and then analysed the results to identify skills gaps and trends and build custom learning paths that streamline access to rich learning resources in Bentley LEARN, including online videos, tutorials and virtual classrooms.


“One year later, after completing targeted training, our retest scores are 30-40% higher; providing substantial return on investment (ROI).  We couldn’t have achieved these levels of improvement without a total solution for skills analysis and improvement.”


Peter Taylor, Engineering Systems Manager, Tata Steel Projects, Manchester, UK.


KnowledgeSmart is now central to TSP’s talent strategy.  All future assessments will be administered by HR and tied in with annual appraisals for continued skills improvement.
 
R

Thursday, 11 June 2015

Assembling an “All-Star” Team

According to the Harvard Business Review, “…evidence from a variety of industries suggests that star employees outperform other employees by a country mile”.  Indeed, in its 2013 article ‘Making Star Teams Out of Star Players’, HBR wrote: “When it comes to an organization’s scarcest resource—talent—the difference between the best and the rest is enormous. In fields that involve repetitive, transactional tasks, top performers are typically two or three times as productive as others.”  KnowledgeSmart subscribers agree.

 “We know we’ve hired super stars when they score highly on KnowledgeSmart.  We also know that we can plug super stars into any team and they’ll increase team productivity.”

Peter Taylor, Engineering Systems Manager, Tata Steel Projects (TSP), Manchester, UK.

Taylor is responsible for reviewing emerging software offerings as he develops and maintains the strategy for best use of software on TSP projects. 

Being ‘KnowledgeSmart’ when Resourcing Projects

At Stantec – a multi-discipline AE firm with 15,000 employees – KnowledgeSmart skills analysis improves insight for management when comparing skills, knowledge and performance across project teams.  Stantec also uses KnowledgeSmart to improve the productivity of teams by balancing team skills.

 “Where a team is already resourced, we use KnowledgeSmart scores to ‘Re-shuffle the deck’ so that we may better position people to maximise the effectiveness of their contributions.  For Stantec this ability to ‘course correct’ teams and individuals delivers increased efficiencies.”

 Jim Marchese, Corporate BIM Education Leader, Stantec, Philadelphia, USA.

It’s not just AEC firms who benefit by resourcing projects with the best talent.  HBR states that the best developer at Apple is at least nine times more productive than the average software engineer at other technology companies.

Daily Decisions Should be Data Driven

At A2K – one of the largest Autodesk channel partners globally, with offices throughout Australia, New Zealand and China – the team recognises that the world has changed.  They claim that with 20-20 insight into software skills, project team leaders can now resource projects with greater certainty for higher-quality outcomes.

 “With customer loyalty programs gathering data about our every purchase and with Amazon predicting what we are ‘likely’ to ‘want’ to buy in future, BIG DATA has changed how we interact and behave.  In much the same way we can now use KnowledgeSmart BIM skills assessment results to remove emotion from the project team resourcing process, to guarantee the best outcomes.”

 Sean Twomey, ANZ Consulting Manager, A2K Technologies, Sydney, Australia.

Twomey is not alone.  “Old-school subjective assessments are unreliable,” explained Lonnie Cumpton, Senior Director of Professional Services at US CAD.  “Such assessments are mere opinions unsupported by facts.  And when the opinion is wrong, it very often proves to be an extraordinarily expensive way to do business,” he concluded.

 Informed Inter-Team Mobility Delivers Productivity Gains

At RTKL – a multi-discipline design practice with global presence – KnowledgeSmart is being used to make smarter, better-informed, data-driven decisions about people’s software skills when resourcing project teams.

 “HR is involved in the process of project resourcing when new project teams are assembled.  HR knows ‘why’ people were hired and retains a high-level view of the firm’s talent base.  At RTKL, the nature of project-centric resourcing determines that individuals are moved from one project team to another as the workflow rises and falls across the project lifecycle.”

 Andrew Victory, Associate, RTKL, London, UK.

HOK agrees.  “HOK is committed to making sure every team member’s skills and knowledge are as deep and as broad as can be to streamline the project resourcing process.”

 Even Super Stars Need to Train

Back at Stantec, the HR department recognises that super stars are not ‘born’; they are ‘developed’.  From years of experience, Stantec knows that world-class talent of all kinds will multiply the productivity and performance advantages of an all-star team.

“By providing just-in-time (JIT) training we are making it easier to resource projects.  We can select the right people for a given project based on their domain skills, and using KnowledgeSmart skills assessments we can identify and eliminate skills gaps by completing targeted ‘small sip’ training modules.”

Susan Clark-Serediuk, HR Leader, Stantec, Toronto, Canada.

In this way Stantec is gradually assembling a series of all-star teams.

HBR’s key takeaway

 “Companies that are good at managing “A” players keep comprehensive, granular data on where their people are currently deployed, what those people do, how good they are in their current roles, and how transferable their skills may be.”

KnowledgeSmart companies use that information to continually improve their staffing resources and to deploy them more effectively.

R

Monday, 8 June 2015

Training Needs Analysis (TNA)

Popular AEC trainer and author Shaun Bryant is writing a three-article series for AEC Magazine about how design and engineering firms make the move to BIM.

Part one of the series can be found here.

In part two, Bryant addresses the issue of identifying training needs, in the context of a wider Revit software implementation.

Here is an excerpt from the article:

CAD manager to director level

“We will need to ensure that all Revit users undergo a Training Needs Analysis (TNA) to assess their training needs and requirements. This will give us a picture of exactly what they need and allow us to use the training effectively and get the most out of our incumbent training provider. It is imperative we allow the team to work with their strengths but also get trained up on areas of weakness, so that we have fully rounded Revit users that are effective and productive.”

CAD manager to the team

“We need you all to undergo a Training Needs Analysis (TNA). This is to assess your existing Revit knowledge (if you have any) and what areas you need to work on to make sure you are fully trained on every aspect of Revit you need to perform your role within the practice effectively. We need you to make sure that you include everything in your TNA so that we can get the best training for you from our training provider.”

Effective training on any CAD product is imperative. The CAD manager is using the TNA to ensure each Revit user is trained to their strengths and that any areas where their product knowledge is weak are thoroughly assessed and appropriate training given. The TNA is done individually, per user, to make sure that each user gets training tailored to them. It also provides the user with the reassurance that, with the new CAD product – in this case Revit – they will be fully trained and prepared to use the product on live projects that the practice is, or will be, working on.

The full article can be found here.

KnowledgeSmart is a popular choice for AEC firms who are looking to capture accurate, independent data on software skills gaps, across their business. Detailed training needs analysis for individuals, teams and organisations is a key deliverable of the KS tools.
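
To make the idea concrete, here is a minimal Python sketch of how per-topic assessment results might be turned into an individual training plan. This is purely illustrative: the topic names, target scores and module titles are assumptions for the example, not KnowledgeSmart's actual data model or API.

# A minimal, hypothetical sketch: topic names, targets and module titles are
# illustrative assumptions showing how per-topic gap scores could drive
# individual training recommendations.

from dataclasses import dataclass

@dataclass
class TopicResult:
    topic: str    # assessment topic, e.g. "Revit Families"
    score: int    # percentage achieved in the assessment
    target: int   # benchmark score the firm expects for this topic

# Hypothetical mapping from weak topics to bite-size training modules.
MODULES = {
    "Revit Families": "Creating and Editing Families (2 hrs)",
    "Worksharing": "Worksets and Collaboration Basics (1 hr)",
    "Scheduling": "Schedules and Quantities (1 hr)",
}

def training_plan(results):
    """Recommend a module for every topic scoring below its target, biggest gaps first."""
    plan = []
    for r in sorted(results, key=lambda r: r.score - r.target):
        if r.score < r.target:
            module = MODULES.get(r.topic, "General refresher: " + r.topic)
            plan.append(f"{r.topic}: scored {r.score}% (target {r.target}%) -> {module}")
    return plan

if __name__ == "__main__":
    for line in training_plan([
        TopicResult("Revit Families", 45, 70),
        TopicResult("Worksharing", 80, 70),
        TopicResult("Scheduling", 55, 70),
    ]):
        print(line)

Run against a single user's results, a sketch like this prints only the topics where the user falls short of the benchmark, ordered by the size of the gap – exactly the per-user, per-topic picture the TNA is meant to give the CAD manager.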

R

Thursday, 14 May 2015

Why self-assessment is (probably) fundamentally flawed (part two)

This is part two of my case against the use of self-assessment as a reliable, exclusive means of measuring staff performance.

The Latin maxim “ignoramus et ignorabimus”, meaning "we do not know and will not know", stood for a position on the limits of scientific knowledge in the nineteenth century.

In September 1930, mathematician David Hilbert pronounced his disagreement in a celebrated address to the Society of German Scientists and Physicians, in Königsberg:

“We must not believe those, who today, with philosophical bearing and deliberative tone, prophesy the fall of culture and accept the ignorabimus. For us there is no ignorabimus, and in my opinion none whatever in natural science. In opposition to the foolish ignorabimus our slogan shall be: We must know - we will know!”

In 2002, the then United States Secretary of Defense, Donald Rumsfeld, whilst defending his country's position on Iraq, made the following (now infamous) statement: “There are ‘known knowns’. These are things we know that we know. There are ‘known unknowns’. That is to say, there are things that we now know we don’t know. But there are also ‘unknown unknowns’. These are things we do not know we don’t know”.
 
There are four recognised stages of competence.

1) Unconscious Incompetence
The individual neither understands nor knows how to do something, nor recognizes the deficit, nor has a desire to address it. The person must become conscious of their incompetence before development of the new skill or learning can begin.

2) Conscious Incompetence
Though the individual does not understand or know how to do something, he or she does recognize the deficit, without yet addressing it.

3) Conscious Competence
The individual understands or knows how to do something. Demonstrating the skill or knowledge requires a great deal of consciousness or concentration.

4) Unconscious Competence
The individual has had so much practice with a skill that it becomes “second nature” and can be performed easily. He or she may or may not be able to teach it to others, depending upon how and when it was learned.

"The Invisible Gorilla" experiment is one of the most famous psychological demo's in modern history. Subjects are shown a video, about a minute long, of two teams, one in white shirts, the other in black shirts, moving around and passing basketballs to one another. They are asked to count the number of aerial and bounce passes made by the team wearing white, a seemingly simple task. Halfway through the video, a woman wearing a full-body gorilla suit walks slowly to the middle of the screen, pounds her chest, and then walks out of the frame. If you are just watching the video, it’s the most obvious thing in the world. But when asked to count the passes, about half the people miss it. It is as though the gorilla is completely invisible.
(http://www.theinvisiblegorilla.com/gorilla_experiment.html).

In his popular KM blog, Nick Milton (http://www.knoco.co.uk/Nick-Milton.htm) writes in detail about the impact of this experiment and picks up on a number of key trends discussed in the book of the same name, authored by Christopher Chabris and Daniel Simons (the guys behind the original experiment).

The subtitle of the book is "ways our intuition deceives us", and the authors talk about a number of human traits (they call them illusions) which we need to be aware of in Knowledge Management, as each of them can affect the reliability and effectiveness of Knowledge Transfer.

To paraphrase Milton, the illusions which have most impact on Knowledge Management are;

• The illusion of memory
• The illusion of confidence
• The illusion of knowledge

Our memory of events fades over time, to the point that even firm documentary evidence to the contrary doesn't change what we remember. The implication is that if you will need to re-use tacit knowledge in the future, then you can't rely on people to remember it. Even after a month, the memory will be unreliable. Details will have been added, details will have been forgotten, the facts will have been rewritten to be closer to "what feels right".

Tacit knowledge is fine for sharing knowledge on what's happening now, but for sharing knowledge with people in the future then it needs to be written down quickly while memory is still reliable.

Without a written or photographic record, the tacit memory fades quickly, often retaining enough knowledge to be dangerous, but not enough to be successful. And as the authors say, the illusion of memory can be so strong that the written or photographic record can come as a shock, and can feel wrong, even if it’s right.

Any approach that relies solely on tacit knowledge held in the human memory can therefore be very risky, thanks to the illusion of memory.

The illusion of confidence represents the way that people value knowledge from a confident person. This would be fine if confidence and knowledge went hand in hand, but in fact there is almost an inverse relationship: a lack of knowledge is allied to overconfidence, and that confidence leads to you being seen as knowledgeable.

Consider chess: each player is given a points rating based on their competition results, which is in fact a very effective and reliable measure of their ability. Yet 75% of chess players believe they are underrated, despite the evidence to the contrary. They are overconfident in their own ability.

In studies of groups of people coming together to solve a maths problem, you would expect the group to defer to the person with the greatest maths knowledge, wouldn't you? In fact, the group deferred to the most confident person, regardless of their knowledge. In trials, in 94% of cases, the final answer given by the group was the first answer suggested by the most confident person present, regardless of whether it was right or wrong.

In a Harvard study of confidence vs knowledge in a trivia test, they certainly saw overconfidence in individuals - people were confident of their answer 70% of the time, while being correct only 54% of the time! When people were put together in pairs, the counterintuitive outcome was that the pairs were no more successful than the individuals, but they were a lot more confident! When two low-confidence people were put together, their overall confidence increased by 11%, even though their success rate was no higher than before.

The Illusion of Knowledge is behind the way we overestimate how much we know. The authors refer to how people think they know how long a project will take, and how much it will cost, despite the fact that projects almost always overrun in both cost and time. "We all experience this sort of illusory knowledge, even for the simplest projects," they write. "We underestimate how long they will take or how much they will cost, because what seems simple and straightforward in our mind typically turns out to be more complex when our plans encounter reality. The problem is that we never take this limitation into account. Over and over, the illusion of knowledge convinces us that we have a deep understanding of what a project will entail, when all we really have is a rough and optimistic guess based on shallow familiarity."

"To avoid this illusion of knowledge, start by admitting that your personal views of how expensive and time-consuming your own seemingly unique project will be are probably wrong. If instead, you seek out similar projects that others have completed, you can use the actual time and cost of these projects to understand how long yours will take. Taking such an outside view of what we normally keep in our own minds dramatically changes how we see our plans"

If we are unaware of these 3 illusions, we can feel confident in our knowledge, based on our memories of the past, without realising that the confidence is false, the knowledge is poor, and the memories are unreliable and partially fictitious. Awareness of these illusions allows us also to challenge the individual who confidently declares "I know how to do this. I remember how we did it 5 years ago", because we recognise the shaky nature of confidence, knowledge and memory.

A natural human tendency is that we think we know more than we do and that we are better than we are. We suffer from what psychologists call the “Lake Wobegon effect”, named after Garrison Keillor’s fictional town where “all the women are strong, all the men are good-looking and all the children are above average.” According to the authors’ own survey, 63% of Americans consider themselves more intelligent than the average American.

In contrast, 70% of Canadians said they considered themselves smarter than the average Canadian. In a survey of engineers, 42% thought their work ranked in the top 5% among their peers. A survey of college professors revealed that 94% thought they do “above average” work – a figure that defies mathematical plausibility! A survey of sales people found that the average self-assessment score (for sales demos) was 76%, while the average percentage of demos that actually achieved their objectives (for the same group) was 57%. The list goes on.

So, in summary, any strategy for capturing user skills data, which relies solely on an individual's ability to self-rate themselves on a given subject, is simply doomed to fail. I leave the last word to David Dunning; “In essence, our incompetence masks our ability to recognize our incompetence”.

R

Tuesday, 12 May 2015

Why self-assessment is (probably) fundamentally flawed (part one)

A while back, we wrote a paper about the pros and cons of using self-assessment as an exclusive means of gaining useful corporate intelligence and capturing management metrics for staff performance.

Over the years, many AEC firms have confidently stated, 'We don't need independent skills testing, we already know how good our teams are'. When one enquires further, what they actually mean, is that they sent out a user survey, asking staff to rate themselves (usually out of 5) on a range of different skills topics, including AutoCAD, Revit, BIM, etc. What they end up with is a spreadsheet (why is it always a spreadsheet?) with a list of names down one side, a list of skills categories across the top - and a sheet filled with 3's and 4's. Why 3's and 4's, I hear you ask? Simply because people don't want to risk the personal penalties that might go along with admitting they're a 1 or a 2. And conversely, they don't want to stick their head above the parapet by admitting to a 5 (even if they are a 5) because this can cause all sorts of new issues (more work, more people pestering them for answers to the same questions, you get the picture). So it's 3's and 4's all the way.

Congratulations XYZ Engineers, you have your completed spreadsheet, so you are now totally self-aware, as an organization. (Not really). And the real rub here is that, more often than not, people have no clue how good they are, relative to the rest of the team, or wider industry!

So, we decided to explore this concept in greater detail. Here's the evidence…

Let us begin with a story. A few years back, NY Times Online posted a series of articles by filmmaker Errol Morris. He tells the tale of bank robbery suspect McArthur Wheeler, who was recognized by informants who tipped detectives to his whereabouts after his picture was telecast one Wednesday night during the Pittsburgh Crime Stoppers segment of the 11 o’clock news. At 12:10 am, less than an hour after the broadcast, he was arrested. Wheeler had walked into two Pittsburgh banks and attempted to rob them in broad daylight.

What made the case peculiar is that he made no visible attempt at disguise. The surveillance tapes were key to his arrest. There he is with a gun, standing in front of a teller demanding money. Yet, when arrested, Wheeler was completely disbelieving. “But I wore the juice,” he said. Apparently, he was under the deeply misguided impression that rubbing one’s face with lemon juice rendered it invisible to video cameras.

Pittsburgh police detectives who had been involved in Wheeler’s arrest explained that Wheeler had not gone into “this thing” blindly but had performed a variety of tests prior to the robbery. Although Wheeler reported that the lemon juice was burning his face and his eyes, and that he was having trouble seeing and had to squint, he had tested the theory, and it seemed to work. He had snapped a Polaroid picture of himself and wasn't anywhere to be found in the image.

There are three possibilities:
(a) the film was bad;
(b) Wheeler hadn’t adjusted the camera correctly; or
(c) Wheeler had pointed the camera away from his face at the critical moment when he snapped the photo.

Pittsburgh Police concluded that, 'If Wheeler was too stupid to be a bank robber, perhaps he was also too stupid to know that he was too stupid to be a bank robber - that is, his stupidity protected him from an awareness of his own stupidity.'

Now, this sorry tale might have been just another footnote in history, were it not for the fact that it came to the attention of David Dunning, a Cornell professor of social psychology. After reading this story in 1996, Dunning wondered whether it was possible to measure one’s self-assessed level of competence against something a little more objective – say, actual competence.

Over the next 3 years, Dunning (assisted by colleague Justin Kruger) undertook a major academic study and, in 1999, published the paper, “Unskilled and Unaware of It: How Difficulties of Recognizing One’s Own Incompetence Lead to Inflated Self-assessments”.

Dunning’s epiphany was: “When people are incompetent in the strategies they adopt to achieve success and satisfaction, they suffer a dual burden; not only do they reach erroneous conclusions and make unfortunate choices, but their incompetence robs them of the ability to realize it. Instead, like Mr. Wheeler, they are left with the erroneous impression they are doing just fine. In essence, our incompetence masks our ability to recognize our incompetence”.

Dunning & Kruger also quote the “above-average effect”, or the tendency of the average person to believe he or she is above average, a result that defies the logic of statistics. Participants scoring in the bottom quartile on tests grossly overestimated their performance and ability. Although test scores put them in the 12th percentile they estimated themselves to be in the 62nd.

Conversely, because top performers find the tests they confront to be easy, they mistakenly assume that their peers find the tests to be equally easy. As such, their own performances seem unexceptional. In studies, the top 25% tended to think that their skills lay in the 70th–75th percentile, although their performances fell roughly in the 87th percentile.

Dunning and Kruger proposed that, for a given skill, incompetent people will:

- tend to overestimate their own level of skill;
- fail to recognize genuine skill in others;
- fail to recognize the extremity of their inadequacy;
- recognize and acknowledge their own previous lack of skill, if they can be trained to substantially improve.

A follow-up study, “Why the unskilled are unaware: Further explorations of (absent) self-insight among the incompetent” (David Dunning, Justin Kruger, Joyce Ehrlinger, Kerri Johnson and Matthew Banner), was published in 2006.

In part two, we'll take a look at the four stages of competence - and how the combined illusions of memory, confidence and knowledge can impact on a firm's knowledge management strategy.

R

Wednesday, 22 April 2015

Information is Beautiful

Skills assessments generate a lot of useful data. We have always been in the business of gathering information. And we have thousands of records to analyse. A key theme for KS this year is discovering new and interesting ways to anonymously present the findings of our global data capture.

As well as test topic, score & time values, we also capture optional additional background information about KS users, including:

Primary Industry/Discipline
Primary Role
Country
State (NB if US/CAN/Aus selected in 'Country' field)
Self rating (1-5)
How many years have you used BIM/CAD/Engineering software?
How often do you use BIM/CAD/Engineering software?
How did you primarily learn to use BIM/CAD/Engineering software?
Where did you first learn to use BIM/CAD/Engineering software? (Country/State)
What BIM/CAD/Engineering software do you regularly use?
Please specify any other software.

Our goal in the coming months is to create a series of topical benchmarking stories from the information captured.

For example:

Average Revit Architecture test score/time for Architects based in California, using the software for 5 years or more, self-taught, part-time users, who learned in the USA.

Average Revit Structure test score/time for Structural Engineers based in NSW, Australia, using the software for 2-5 years, formally trained, full-time users, who learned in Australia.

Average MicroStation score/time for self-taught vs formally trained users.

Average AutoCAD score/time for < 5 year users vs > 10 year users.

And so on.

As you can see, there are dozens of possible permutations for interesting themes and stories buried in the data.  We just need to analyse and identify the best ones!
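
To illustrate how one of these benchmarking slices might be computed, here is a minimal Python sketch assuming the anonymised results were exported to a CSV. The file name and column names are illustrative assumptions only, not KnowledgeSmart's actual export format.

import pandas as pd

# Hypothetical export of anonymised assessment records.
df = pd.read_csv("ks_results_export.csv")

# Example slice: average Revit Architecture score/time for self-taught,
# part-time Architects in California with 5+ years' experience.
segment = df[
    (df["test_topic"] == "Revit Architecture")
    & (df["primary_role"] == "Architect")
    & (df["state"] == "California")
    & (df["years_used"] == "5+")
    & (df["how_learned"] == "Self-taught")
    & (df["usage_frequency"] == "Part-time")
]
print(segment[["score", "time_minutes"]].mean())

# The same idea generalises: group by any combination of background fields
# to surface candidate benchmarking stories across the whole dataset.
print(df.groupby(["test_topic", "how_learned"])[["score", "time_minutes"]].mean().round(1))

Each extra background field simply becomes another filter or grouping key, which is why the permutations multiply so quickly.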

Here is one of our inspirations for creating amazing infographics:

Information is Beautiful by David McCandless.




Over the coming months, we'll share with you the most interesting stories, data-mined from our industry-leading skills assessment software.

R