Tuesday, 19 June 2012

3 Levels of Revit Proficiency


How do you identify someone who has basic skills in Revit and distinguish them from an advanced user?

Further, do AEC firms actually need all of their users to be 'specialists' in Revit?  In some instances, people such as Construction Managers, PMs, Engineers, and Project Architects just need to receive a file, open it up, and perform some straightforward analysis of the model (without causing any problems).  For these users, a basic working knowledge of some key concepts is perfectly adequate.

For the primary modeling team (the people tasked with creating the detailed production information and modeling outputs on a project), AEC firms require a broader level of understanding.  Firms also need a handful of Revit 'specialists': people with a deeper understanding of the way in which Revit is deployed on a project, an appreciation of the collaborative process, and the impact of using the technology across a broader environment.  And lastly, if content creation is a key part of the firm's process, a handful of Revit Family specialists, who understand the importance of creating quality content that can be re-used on subsequent projects, not simply re-modeled from scratch every time.

So it makes sense to establish 3 recognised levels of competence.  In a broad sense, we could label them:

  • Level 1 – basic skills
  • Level 2 – intermediate skills
  • Level 3 – advanced skills

With respect to Revit, this could work as follows:

  • Level 1 – Basics of Revit (i.e. an assessment aimed at occasional users: Construction Managers, PMs, Engineers, Project Architects, etc.)
  • Level 2 – Revit fundamentals (i.e. core skills for primary modelers)
  • Level 3 – More advanced concepts of Revit (i.e. Revit project process & workflow, collaborative working, Family creation, etc., for BIM Co-ordinators and model managers)

AEC firms can use skills assessment as a means of demonstrating to clients (or prospective clients) and partners that they have the skills required to truly deliver on a BIM project.

KS Library 

In the KnowledgeSmart library, we have a range of modules available which can assist firms in creating an appropriate assessment program for their organisation, capturing helpful benchmark data and skills-gap information.
  • Level 1 – Revit for occasional users (available Summer 12).
  • Level 2 – Revit Architecture fundamentals, Revit for Interiors, Revit MEP fundamentals, Revit Structure fundamentals.
  • Level 3 – Revit Project Process, Revit Process & Workflow, Revit Architecture advanced (available Summer 12), Revit Structure advanced (available Summer 12), Revit Content Creation.

Firms might also consider writing their own custom assessment modules, based on their BIM standards; by passing such a module, users can demonstrate their understanding of the correct standards and protocols.

Self-Assessment

In the broader context of self-assessment, the “above-average effect” is the tendency of the average person to believe he or she is above average, a result that defies the logic of statistics.

In studies, participants scoring in the bottom quartile on tests grossly overestimate their performance and ability: although their test scores put them in the 12th percentile, they estimate themselves to be in the 62nd.

Because top performers find the tests they confront to be easy, they mistakenly assume that their peers find the tests to be equally easy. As such, their own performances seem unexceptional.  In studies, the top 25% tend to think that their skills are in the 70th–75th percentile, although their performances fall roughly in the 87th percentile.

Business Benefits

There are five distinct business areas that can benefit from measuring team performance:

Project Technology Leadership:
  • Measure returns on technology investment to eliminate risk of waste.
Business Development Team:
  • Differentiate your services and benchmark your competitive advantage to win more work.
HR Leadership:
  • Optimise the mix of people and skills via regular assessment to retain top talent.
  • Filter out poorly skilled candidates at interview to avoid failed hires, saving time and money.
  • Encourage on-the-job training to reduce time away from the office and increase staff impact.
Learning and Development Team:
  • Train staff based on individual needs to maximize impact and raise service quality.
  • Measure returns on learning and development investments to eliminate risk of waste.
  • Make your training budget go further by improving learning and development efficiency.
Project Delivery Team:
  • Resource project teams with a balanced mix of skills to deliver high-quality services profitably.
  • Create an environment of team learning and growth to improve client service levels.

Benchmarking

Benchmarking is the process of comparing the cost, time or quality of what one organization does against what another organization does. The result is often a business case for making changes in order to make improvements. Also referred to as "best practice benchmarking" or "process benchmarking", it is a concept used in management where organizations evaluate various aspects of their processes in relation to best practice, usually within their own sector. This then allows organizations to develop plans on how to make improvements or adopt best practice, with the aim of increasing performance. 

It is important to target your teams with clear goals for advancement, as part of your wider learning & development and talent management strategy.

Here are the current (2012) benchmark performance quartiles for the three disciplines of Revit, for your firm to compare results against:

Revit Architecture 2012:  69% in 1 hour 4 mins 30 secs



Revit MEP 2012:  64% in 1 hour 9 mins 30 secs



Revit Structure 2012:  62% in 1 hour 6 mins 0 secs
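For firms comparing their own results against these figures, note that both accuracy and elapsed time matter. Here is a minimal sketch of such a comparison; the helper names are illustrative, not part of the KS tools, and the benchmark figures are the 2012 averages quoted above:

```python
def to_seconds(hours=0, minutes=0, seconds=0):
    """Convert an elapsed time to total seconds, for easy comparison."""
    return hours * 3600 + minutes * 60 + seconds

# 2012 benchmark figures from the post: (average score %, average elapsed time)
benchmarks = {
    "Revit Architecture": (69, to_seconds(1, 4, 30)),
    "Revit MEP":          (64, to_seconds(1, 9, 30)),
    "Revit Structure":    (62, to_seconds(1, 6, 0)),
}

def beats_benchmark(discipline, score, elapsed_seconds):
    """A result beats the benchmark if it is both more accurate and faster."""
    bench_score, bench_time = benchmarks[discipline]
    return score > bench_score and elapsed_seconds < bench_time

# e.g. 75% in 55 minutes beats the Revit Architecture benchmark
print(beats_benchmark("Revit Architecture", 75, to_seconds(0, 55, 0)))  # True
```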


New Hires

According to the report “U.K. Talent Acquisition Factbook 2011” by Bersin & Associates, the new-hire failure rate is 1 in 8. Put another way, for every 8 new recruits, 1 will leave within 12 months because they were a bad fit for the company. The same report measured the average recruitment cost to be £5,300 GBP / $8,200 USD per person.
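To put those figures in context, the expected recruitment spend lost to failed hires scales linearly with hiring volume. A rough back-of-the-envelope sketch using the Bersin figures (the function name is illustrative):

```python
def expected_failed_hire_cost(num_hires, failure_rate=1 / 8, cost_per_hire=5300):
    """Expected recruitment spend (GBP) lost to hires who leave within 12 months."""
    return num_hires * failure_rate * cost_per_hire

# For a firm making 40 hires a year: 5 expected failures, £26,500 of wasted spend.
print(expected_failed_hire_cost(40))
```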

Learning & Development Data

According to data collected from software training professionals, many trainers spend approximately 40% of the training course time revising prior training topics to assess group skill levels. This ‘Redundant Training Time’ reduces classroom productivity, increases training costs, and keeps staff away from their project teams for longer.
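That redundant time adds up quickly across a cohort. A simple sketch (names hypothetical), assuming the 40% figure above:

```python
def redundant_training_hours(course_hours, attendees, redundant_fraction=0.4):
    """Person-hours spent revising prior topics rather than learning new material."""
    return course_hours * redundant_fraction * attendees

# A two-day (16-hour) class of 10 staff: 64 person-hours of redundant time.
print(redundant_training_hours(16, 10))
```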

Skills Gaps

Based on analysis of KS results data from firms located in the UK, US, Canada, Australia and New Zealand, here is a summary of the top 20 most commonly flagged training issues and skills gaps, for each of the 3 Revit disciplines.

Revit Architecture 

ViewCube
View Range
Borrowers
Family Editor
Families
Conceptual Massing
View Properties
Element Properties
User Interface
Detail Components
Worksets
Coordinates
Rooms
Walls
Doors
Floors
Calculating Area
Shortcuts
Schedules
CAD Files


Revit MEP

Apparent Load Values
Modifying Light Fixtures
Photometric Web Files
System Inspector
Creating Power Systems
Edit Properties
Panel Circuits
Panel Schedules
Selecting Equipment
Electrical Settings
Creating Supply Systems
Duct Systems
Load Family
Pipe Systems
Spaces
Family Elements
Plumbing Fixtures
Properties
Visibility/Graphics
Worksets


Revit Structure

Beams
Components
Cut Length
Element Properties
Roofs
Families
Instance Properties
Schedules
View Parameters 
Annotation
Scope Boxes
Visibility
Structural Deck
Span Direction Symbol
Slabs
Filters
Slope Arrows
Dimensions
Repeating Details
Structural Settings

For firms involved in the adoption, rollout and training of their teams in Revit, skills assessment and benchmarking are an essential ingredient in the overall successful deployment of this technology.

Taking a more structured, scientific approach will help AEC firms, Construction Clients and Contractors alike to get a much better handle on who knows what with respect to deploying Revit on a BIM project.  It will also help firms to demonstrate their BIM credentials and give users a logical (and more appropriate) path for skills development.

R

Monday, 11 June 2012

Club Revit 2012 Knowledge Challenge - Results



For the past couple of months, we have been running an open Revit knowledge challenge in association with leading industry network Club Revit.  Contestants were presented with 24 questions, with a time limit of 30 minutes to complete the assessment.  The best score in the fastest time determined the winner for each Track.

The competition is now finished, so let's review how it all went.

The Results 

Congratulations to the following contestants, who won their respective Tracks:

Revit Architecture Track Winner:  Duarte Couto

Revit MEP Track Winner:  Don Bokmiller

Revit Structure Track Winner:  John Fout


So, let's take a detailed look at the results of this year's Club Revit Knowledge Challenge.

Overall

No. of Tests Completed:  198
Overall Average:  66% in 24 mins 45 secs


Track 1 - Revit Architecture

Track Winner:  Duarte Couto
Winning Score:  99% in 15 mins 40 secs

Top 10 Contestants:

Duarte Couto        
John Fout 
Kate Hovis  
Melissa Thiessens       
Jeremy Stroebel     
Randy Rush    
Matt McKechnie
Morgan Blum
Rebecca Frangipane
Simon Gale

No. Completed:  139
Group Average:  68% in 24 mins 22 secs
Top 10 Average:  88% in 21 mins 0 secs
Bottom 10 Average:  21% in 30 mins


Track 2 - Revit MEP

Track Winner:  Don Bokmiller
Winning Score:  88% in 24 mins 50 secs

Top 10 Contestants:

Don Bokmiller
Julian Jameson
Sam Scorsone
Jason Seck
Rebecca Frangipane
Glen Walson
Juliana Milanov
John Fout
Jonathan Herrera
Thomas Maleski

No. Completed:  33
Group Average:  64% in 27 mins 29 secs
Top 10 Average:  81% in 24 mins 32 secs
Bottom 10 Average:  42% in 29 mins 20 secs



Track 3 - Revit Structure

Track Winner:  John Fout
Winning Score:  95% in 24 mins 25 secs

Top 10 Contestants:

John Fout 
Rebecca Frangipane
Ben Osborne
Jesse Mickle
Ben May
Victoria Prescott
Mike Bolduc
Kathleen Chapman
George Guevarra     
Isabella Risolvo 

No. Completed:  26
Group Average:  58% in 26 mins 14 secs
Top 10 Average:  81% in 24 mins 27 secs
Bottom 10 Average:  33% in 27 mins 57 secs



Further Analysis & Observations

Popular Tracks

The most popular tracks, in order of completed tests, were as follows:

Revit Architecture - 139 results
Revit MEP - 33 results
Revit Structure - 26 results


Training Needs Analysis

So what does all this mean, in terms of performance and training?

For Revit Architecture, the top 10 training items were, in priority order:
Key Schedules, Schedules, Tagging, 3D Components, Graphic Display, Detach File, File Management, Conceptual Massing, Model Management and Worksharing.



For Revit MEP, the top 10 training items were, in priority order:
Apparent Load Values, Edit Properties, Modifying Light Fixtures, Photometric Web Files, Annotation, Key Schedules, Schedules, Tagging, Model Management and Views.



For Revit Structure, the top 10 training items were, in priority order:
Key Schedules, Schedules, Tagging, 3D Components, Graphic Display, Types, Families, Clipboard, Data Integrity and Model Management.



Overall

It's interesting to compare results across the 3 disciplines, which arguably reflects where Revit currently sits in terms of wider industry adoption. Revit Architecture proved to be the most popular track, and the overall experience of the users taking part suggests that this software is the most mature in terms of individual expertise. Revit MEP and Revit Structure were close in terms of overall numbers, with RMEP posting average scores 6 percentage points higher than RST, although RMEP's average elapsed time was longer.

RAC overall:  68% in 24 mins 22 secs
RMEP overall:  64% in 27 mins 29 secs
RST overall:  58% in 26 mins 14 secs
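As a sanity check, the overall average of 66% reported earlier is consistent with these per-track figures when weighted by the number of completed tests:

```python
tracks = {            # (completed tests, group average score %)
    "RAC":  (139, 68),
    "RMEP": (33, 64),
    "RST":  (26, 58),
}

total_tests = sum(n for n, _ in tracks.values())
weighted_avg = sum(n * avg for n, avg in tracks.values()) / total_tests

print(total_tests, round(weighted_avg))  # 198 tests, 66%
```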


Comparing the top 10 and bottom 10 performers in each category also gives a fascinating insight into the huge variance of practical ability out there in the AEC industry.  The bottom 10 RAC scores averaged just 21% accuracy, with RST faring little better at 33%.  Overall, RMEP was the most consistent group, with a bottom 10 average of 42% (or, put another way, twice as accurate as RAC).

This sends out a rather important message to firms who recruit Revit specialists, without any means of skills assessment at interview!

Honourable mentions

Along with our track winners, the following contestants deserve a special mention, for their overall performance in the competition:

John Fout, for winning the RST track, placing second for RAC and eighth for RMEP.  A true all-rounder!

Rebecca Frangipane, for placing second in the RST track, posting a top 5 score in RMEP and top 10 for RAC.  Another exceptional performance!

Philip Russo, for posting above average scores in both RAC and RMEP tracks.

Dishonourable mentions

All those competitors who were disqualified for taking an assessment more than once in the same track, which was a definite no-no, as mentioned in the contest rules.  You know who you are!  The RAC track had the highest percentage of disqualified contestants.  Does this mean engineers abide by the rules more than architects?

Boys vs Girls

Finally, let's take a look at the demographic breakdown of the competition. Out of 198 contestants, 155 were male and 43 female. The average overall performance for each group breaks down like this:

Male: 66% in 25 mins 24 secs
Female: 68% in 24 mins 7 secs


So, there we have it! The girls are the winners in the Revit battle of the sexes!  Both more accurate and faster than their male colleagues.  The gauntlet has been thrown down!

Congratulations to all our winners. Thanks to everyone who took part in the competition.  And special thanks to Lonnie Cumpton and the Club Revit team for making it all happen.

R

Wednesday, 6 June 2012

KS Customers - Examples of Good Practice

We've seen some really innovative use of the KS tools in recent months, from customers in the UK, North America and Australia.  So I wanted to drop a brief note on the KS blog to recognise these firms and highlight the work they've been doing.

In the UK, Jane Carter and the team at BDP have created a suite of modular MicroStation training material, called BDP Bytesize, and made this available to all users via the BDP Intranet, 'Planet'. Jane has mapped the Bytesize training content to the KS assessment metadata and can now link to the relevant training material as part of the overall test results feedback.  The next step in Jane's plan is to adopt the same approach for their Revit training program, using the modular content written by the team at White Frog (www.whitefrog.co) as their Bytesize training library for the three disciplines of Revit.  Jane has created a new instance of the KS Revit Architecture assessment by combining the standard KS 'fundamentals' test with the KS Community White Frog question library.

In Australia, Dan Jurgens and his team at Cox Architecture have created a custom Revit test, by spending a considerable amount of time reconfiguring all of the Revit content available in the KS library.  This includes questions from the KS 'fundamentals' test, Revit Project Process, Revit Content Creation, KS Community Revit Process and workflow, plus some original questions written in-house.  The result is a challenging assessment for all Cox users, across 6 studios, which is creating a meaningful performance benchmark, together with a detailed training plan for all staff.

In Canada, Bruce McCallum and the learning and development team at DIALOG are creating a powerful in-house assessment and training environment.  DIALOG was looking for an effective way to get more training completed in less time and at lower cost, and partnered with Global e-Training (www.globaletraining.ca) and KnowledgeSmart to create a customized learning solution: in-depth online skills assessments to determine skills gaps, combined with e-Training modules focusing on the areas that need improvement, specific to each employee. We are mapping the KS metadata for all Revit assessments to the corresponding learning content held in the Ge-T LMS.  Bruce is also submitting a paper for consideration at AU 2012, describing the process and the challenges his team has faced along the journey.

R

KS Exports Update


We achieved an important strategic milestone last month, further evidence that our growth strategy for international markets is on the right track.  In April, overseas sales outpaced domestic sales for the first time. With accelerating adoption in all overseas markets, high renewal rates, and a doubling of our subscription business in North America over the past year, KS has reinforced its position as the preferred supplier for online software skills gap analysis and benchmarking services to global architecture, engineering, and construction (AEC) communities.

This is really good news for the global KS user base. As 100% of our development roadmap is user-driven, with expanded input from subscribers in North America, Australasia and the Middle East, we are confident that we have the right focus for our future technology.

This international growth coincides with our new channel strategy to partner with experienced resellers overseas. Already this year KS has signed partner program agreements with Kelar Pacific (www.kelarpacific.com), an Autodesk Gold Partner with offices in Orange County, San Diego, and Los Angeles; Initial.AEC (www.initialaec.com), an Autodesk software reseller and authorized training centre based in Denver, Colorado; AEC Systems (www.aecsystems.com.au), one of Australia’s largest Autodesk resellers with offices in Sydney, Melbourne, Perth and Brisbane; Cadgroup (www.cadgroup.com.au), an Autodesk Platinum Partner, with offices in Sydney, Adelaide, Brisbane and Perth; and BIMES (www.bimengineering.com), a BIM services consultancy with offices in Dubai, Cairo and Alexandria.

Our goal is to sign a further 7 partners in the UK, South Africa, USA and Canada in the months ahead.  We are already in discussion with several candidates who are a good fit for us.  Our focus is quality, rather than quantity, in this endeavour.

“KnowledgeSmart is starting to turn heads on the west side of the Atlantic,” wrote ENR Construction’s Senior Editor and industry veteran, Tom Sawyer, in a recent article. He concluded, “BIM managers are enthusiastic about its independent online software skills-gap analysis and industry benchmarking service.”
After speaking with a number of KS subscribers including Michael Horta, BIM director at Kasian Architecture and Interior Design in Vancouver, and David Spehar, corporate BIM leader at Stantec in Ohio, Sawyer was convinced. Horta told ENR "I haven't found anything similar. They have the system down pat. I can customize and modify the test, and they collect the metrics and do the analysis. I can evaluate which city has the best users, which city has problem areas and [then] modify training to be very specific to them."

Spehar told ENR why he has expanded his firm's subscription to cover thousands of users. "It's a great tool; it's very important that the service is software-vendor-neutral, too," Spehar says. "We run so lean we needed a tool that doesn't require high technology and support overhead. KnowledgeSmart is easy to get in, create assessments, gather results and analyze them."

With BIM adoption on the rise, and with more demanding clients requesting BIM as a project deliverable, AEC teams are investing more money in online software skills gap analysis and benchmarking services to differentiate their team and win more work.

R

Attachment F assessment

KS was invited to attend a web meeting last week, with the USACE BIM Committee, to talk about the work we have been doing on our 'Attachment F' BIM Process assessment.


Here is the intro text from the session:
"KnowledgeSmart has created a skills assessment, based on the USACE 'Attachment F' BIM Requirements.  This is designed to help USACE, USACE customers, other Federal agencies, and AEC firms to improve understanding and adoption of correct BIM methodology across their supply chain.  It can also help to identify any obvious cases of lack of clarity in AttF messaging, which would be picked up in the subsequent results data. Another benefit being that AEC firms can demonstrate their AttF credentials as part of their project proposals (i.e. if they have a number of team members who have scored well on the assessment, they could include this data in their supporting documentation).
The AttF library of questions is a work in progress and will be updated on an occasional basis, as the standards themselves continue to evolve.  The assessment will be made FREELY available to individuals and firms working on USACE projects".

On a related note, last month, at BIM Show Live, Jacobs' Director of Project Technology, Shawn Foster, delivered an excellent dissection of US Army Corps BIM Requirements and the PxP (Project Execution Plan).  See link for Shawn's bio.

After last week's meeting, we received feedback that the concept met with the committee's general agreement in principle. USACE is setting up a working group to complete the editing of the current set of questions and agree how best to position the material for the wider US Army Corps supply chain.

Still very much a work in progress, like many firms' own BIM journeys.  We'll keep you posted as this project develops.

Click here for a link to the latest version of AttF.

R