Skills assessments generate a lot of useful data. We have always been in the business of gathering information, and we now have thousands of records to analyse. A key theme for KS this year is discovering new and interesting ways to present the findings of our global data capture anonymously.
As well as test topic, score and time values, we also capture optional background information about KS users, including:
State (NB if US/CAN/Aus selected in 'Country' field)
Self rating (1-5)
How many years have you used BIM/CAD/Engineering software?
How often do you use BIM/CAD/Engineering software?
How did you primarily learn to use BIM/CAD/Engineering software?
Where did you first learn to use BIM/CAD/Engineering software? (Country/State)
What BIM/CAD/Engineering software do you regularly use?
Please specify any other software.
Our goal in the coming months is to create a series of topical benchmarking stories from the information captured. For example:
Average Revit Architecture test score/time for Architects based in California, using the software for 5 years or more, self-taught, part-time users, who learned in the USA.
Average Revit Structure test score/time for Structural Engineers based in NSW, Australia, using the software for 2-5 years, formally trained, full-time users, who learned in Australia.
Average MicroStation score/time for self-taught vs formally trained users.
Average AutoCAD score/time for < 5 year users vs > 10 year users.
And so on.
As you can see, there are dozens of possible permutations for interesting themes and stories buried in the data. We just need to analyse and identify the best ones!
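Each of the stories above boils down to the same operation: filter the assessment records by a set of profile fields, then average the score and time of the matching group. Here is a minimal sketch in Python; the field names and sample data are illustrative only, not the actual KS schema:

```python
# Sketch of a benchmark query over assessment records.
# Field names and sample values are hypothetical, not the real KS schema.
records = [
    {"test": "Revit Architecture", "score": 72, "minutes": 38,
     "state": "California", "years": "5+", "training": "self-taught"},
    {"test": "Revit Architecture", "score": 64, "minutes": 45,
     "state": "California", "years": "5+", "training": "self-taught"},
    {"test": "Revit Architecture", "score": 81, "minutes": 40,
     "state": "Texas", "years": "2-5", "training": "formal"},
]

def benchmark(records, **criteria):
    """Return (average score, average minutes) for records
    matching every supplied criterion, or None if no matches."""
    matches = [r for r in records
               if all(r.get(k) == v for k, v in criteria.items())]
    if not matches:
        return None
    n = len(matches)
    return (sum(r["score"] for r in matches) / n,
            sum(r["minutes"] for r in matches) / n)

avg = benchmark(records, test="Revit Architecture",
                state="California", training="self-taught")
print(avg)  # → (68.0, 41.5)
```

Adding or removing keyword arguments is all it takes to slice the data a different way, which is why so many permutations of stories are possible.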
Here is one of our inspirations for creating amazing infographics:
Information is Beautiful by David McCandless.
Over the coming months, we'll share with you the most interesting stories, data-mined from our industry-leading skills assessment software.
Wednesday, 22 April 2015
Sunday, 29 March 2015
A skills matrix, or competency framework, is a table that displays all required skills, tasks or abilities for a team or organisation in one easy-to-view interface.
A training/competency matrix is a tool used to document and compare the required competencies for a position with the current skill level of the people performing the roles. It is used in a gap analysis for determining where individuals have important training needs, and as a tool for managing people development. The benefits of maintaining a skills matrix include:
• Provides a comprehensive list of key skills the company has identified as worth cultivating.
• Aids in managing training budgets because it identifies skill gaps across your organization.
• Assists with planning by helping identify and target new skill areas that you might need.
• Helps managers with development planning by identifying required skills.
• Allows managers to add skills to employee profiles and set, monitor and change their level of expertise.
• Allows employees to view the skills and skill levels their manager has attached to their profile.
• Provides managers with clear areas for employee development that are directly connected to the organization’s identified HR priorities.
• Enables managers to automatically include skills in review meetings.
Irrespective of the labels or names given to these levels, they broadly mean: beginner, learner, skilled, and expert (or master).
How to Build a Skills Matrix
1. List the key roles in your organization.
2. List the competencies required for each role.
3. List the names of each individual for each role.
4. Determine how you want to code the matrix to indicate skill level, training needed, etc.
Possibly use a colour coding system where:
Red = No skills in this area (1)
Amber = Partly trained in this area (2,3)
Green = Fully trained in this area (4,5)
Blank = N/A
5. Fill out the matrix for each person to indicate current skills and any skill gaps.
6. Look for patterns, opportunities, and areas of need.
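The six steps above map naturally onto a simple lookup structure. Here is a minimal sketch of a colour-coded matrix in Python; the people, skills and ratings are made up for illustration:

```python
# Sketch of a skills matrix using the RAG colour coding described above.
# Names and ratings are illustrative only.

def rag_colour(rating):
    """Map a 1-5 skill rating to the Red/Amber/Green scheme."""
    if rating is None:
        return "N/A"      # Blank = not applicable
    if rating == 1:
        return "Red"      # No skills in this area
    if rating in (2, 3):
        return "Amber"    # Partly trained in this area
    return "Green"        # Fully trained (4, 5)

# matrix[person][skill] = rating (None = N/A)
matrix = {
    "Alice": {"Revit Architecture": 4, "AutoCAD": 2, "Navisworks": None},
    "Bob":   {"Revit Architecture": 1, "AutoCAD": 5, "Navisworks": 3},
}

# Step 6: look for skill gaps (any Red or Amber cell).
gaps = [(person, skill)
        for person, skills in matrix.items()
        for skill, rating in skills.items()
        if rag_colour(rating) in ("Red", "Amber")]
print(gaps)
```

Scanning for Red and Amber cells like this is exactly the pattern-spotting described in step 6, just automated.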
In our summer release, we will be adding skills matrix functionality to our user pages area, enabling KS customers to create an easy-to-access database of searchable user skills and abilities.
Friday, 13 February 2015
The most popular KS wishlist request in recent months has been to make it easier for individuals and admins to view assessment history and results data on a per user basis.
This also ties in with our wider goal to make the KS assessment tools more integrated with other Learning and HR systems, where data is shared across two or more platforms.
So we have created a new personal dashboard for individual users to log in and monitor their personal KS history, within their current organisation.
This resource will make it easier for individuals to manage their KS user profile, invite history, learning evidence and assessment results, including benchmarking data. Annual appraisals and performance reviews will be enhanced by the ability to capture multiple HR, training and learning resources in a single place.
The dashboard login page now has a new option, called My KnowledgeSmart.
Here is the KS user homepage, split into 4 main areas: My Details, My Achievements, My Assessments and My Scores.
The My Details page allows users to set their own password, edit personal information and update the 5 data fields which KS admins use to filter wider user and results data. They can also add a picture to personalise their KS page.
The My Achievements page offers an opportunity to upload any useful documents and records relating to their personal development and learning history. For example, training completion certificates, exam results, certification evidence, CPD records, resume or professional record of achievement data.
Users can make documents 'public' or 'private' and create an individual URL for sharing learning resources online:
Clicking on the 'Rosette' icon displays a new certificate, which users can share using social media or via email.
The My Scores page lets users view all their KS scores in one place.
Available data includes: benchmark comparisons, performance charts and skills gap summaries.
KS admins can access individual user pages via their main KS dashboard > Users page. There is a new icon which opens up the user page in a separate window.
KS admins can control access to user pages on a per user basis:
KS admins can also control user page access on a per account basis:
KS admins can also disable the 'sharing' tool, which appears when the certificate is viewed:
If a user record is deleted or set to 'Ex-employee' status, their personal page will no longer be accessible. User data is not portable from one company to another. KS data is account-specific and owned by the organisation, not the user.
We anticipate this area of the KS system developing further, as the year progresses. If you have any ideas for integration with other systems, or if you have suggestions for new tools, charts, reports, and so on, please let us know and we'll add them to the summer wishlist.
Sunday, 18 January 2015
KnowledgeSmart skills assessment software helps firms of architects and engineers to measure software capability.
We have created an extensive range of assessments, covering the most popular design, engineering and BIM software titles, used by firms working in the global construction industry.
Assessments cover a range of popular vendors, including: Autodesk, Bentley Systems, Adobe, McNeel, Trimble and Graphisoft.
We work with a global network of world-class authors. They are published subject matter experts and thought leaders who work as consultants to industry, so they understand the practical application of the tools, as well as the menus and mouse-clicks. Popular KS authors include: CASE Inc, Chris Senior, Darryl McClelland, Eric Chappell, Envision CAD, Evolve, John Evans, Josh Modglin, Joel Harris, Paul Aubin, Paul Woddy, Revit Factory, Robert Manna, Scott Moyse, Scott Onstott, Tony Tedder, Thomas Weir and White Frog.
For our most popular assessments, we tend to create a variety of different modules, covering a range of different levels and abilities.
The current number one topic in our library is Autodesk Revit Architecture. Here is a brief list of assessments available in our 'RAC Bundle'. All of the following titles are available, as part of a KS Professional 1 license:
Revit Architecture for occasional users
Revit Architecture fundamentals
Revit Architecture – Xpress
Revit Architecture for Interiors
Revit Architecture advanced
Revit Content Creation
Revit Project Process
Revit Architecture – Extra qu's 1
Revit Architecture – Extra qu's 2
KS Community – Revit Architecture (White Frog)
KS Community – Revit Process & Workflow
KS Community – BIM Management (NB this one is software neutral, but included in the bundle)
So how does it all work? In simple terms, it is a web-based, practical test of knowledge and ability, so users will need access to a copy of the software for which they are being assessed. They are presented with a series of task-based and knowledge-based questions, entering answers into the KS browser screen. At the end, a summary report with detailed feedback and coaching notes can be displayed to the user. KS system administrators can access detailed reports, highlighting results, benchmark statistics and training recommendations.
Let's take a look at an actual example of our Revit Architecture fundamentals assessment, in this short video:
KS customers have full editorial control over all KS library material, so they are ultimately in the driving seat regarding what is presented to their teams. They also have the option of writing custom questions from scratch, using the KS authoring tools, if they want to capture knowledge of in-house workflows, standards and processes.
Sunday, 14 December 2014
This year, KnowledgeSmart and AUGI again teamed up to provide an interactive skills assessment, across 10 popular Autodesk software tracks: 3ds Max, AutoCAD, AutoCAD Civil 3D, AutoCAD Plant 3D, Inventor, Navisworks, Revit Architecture, Revit MEP, Revit Structure and Vault.
Once again, we had some great prizes up for grabs. Track winners each walked away with a $100 USD Amazon voucher and a glass trophy (a combination of high scores and fastest times determined the winner of each track), while the overall winner received an HP laptop and a free pass to AU2015.
We spent several hours on Tuesday setting up 16 networked PCs in the exhibition hall, at the AUGI stand.
Next, 144 copies of Autodesk software needed checking, then all the KS sample data sets had to be uploaded to each machine and checked again. Big thanks to Tony Brown for his assistance in accomplishing this task.
Here's how we looked when everything was finished:
The main competition ran over 3 days (1 x 3 hour slot on day one, 2 x 3 hour slots on day two and a final 1 hour slot on day three). Contestants had to answer 8-10 questions, using the 2015 version of each software title. Each session was limited to just 12 minutes, so people had to work fast! Special thanks to awesome AUGI team members, Kristin, Bob, Michael, Donnia and Richard, for briefing users and making sure everything ran smoothly.
Here are the Top DAUGs in action:
By the end of the contest, we had posted 215 results.
Throughout the competition, we displayed a rolling list of the top 10 contestants for each track, on the AUGI big screen.
Congratulations to the following contestants, who won their respective tracks:
And a special mention to the overall winner of AUGI Top DAUG 2014:
So, let's take a detailed look at the results of this year's Top DAUG competition.
No. of Tests Completed: 215
Overall Average: 59% in 9 mins 14 secs
(NB the average score for 2013 was 48%).
Track 1 - 3ds Max
Track Winner: Herb Wagner
Winning Score: 55% in 11 mins 35 secs
Top 2 Contestants:
No. Completed: 2
Group Average: 35% in 11 mins 45 secs
Track 2 - AutoCAD
Track Winner: Tracy Chadwick
Winning Score: 100% in 3 mins 50 secs
Top 10 Contestants:
No. Completed: 68
Group Average: 63% in 8 mins 2 secs
Track 3 - AutoCAD Civil 3D
Track Winner: Christopher Fugitt
Winning Score: 88% in 5 mins 55 secs
Top 10 Contestants:
No. Completed: 21
Group Average: 66% in 9 mins 12 secs
Track 4 - AutoCAD Plant 3D
Track Winner: Patrick Alonzo
Winning Score: 72% in 11 mins 30 secs
Top 4 Contestants:
No. Completed: 4
Group Average: 52% in 10 mins 38 secs
Track 5 - Inventor
Track Winner: JD Mather
Winning Score: 81% in 11 mins 25 secs
Top 10 Contestants:
No. Completed: 21
Group Average: 41% in 11 mins 9 secs
Track 6 - Navisworks
Track Winner: Michael Doty
Winning Score: 60% in 6 mins 0 secs
Top 10 Contestants:
No. Completed: 13
Group Average: 49% in 9 mins 45 secs
Track 7 - Revit Architecture
Track Winner: Erik Eriksson
Winning Score: 99% in 10 mins 40 secs
Top 10 Contestants:
No. Completed: 47
Group Average: 57% in 10 mins 51 secs
Track 8 - Revit MEP
Track Winner: Ken Fields
Winning Score: 100% in 9 mins 15 secs
Top 10 Contestants:
No. Completed: 24
Group Average: 64% in 7 mins 32 secs
Track 9 - Revit Structure
Track Winner: Rebecca Frangipane
Winning Score: 85% in 8 mins 30 secs
Top 9 Contestants:
No. Completed: 9
Group Average: 70% in 9 mins 51 secs
Track 10 - Vault
Track Winner: Bryson Anderson
Winning Score: 60% in 6 mins 50 secs
Top 6 Contestants:
No. Completed: 6
Group Average: 53% in 6 mins 17 secs
Most Popular Tracks
The most popular tracks, in order of completed tests, were as follows:
AutoCAD - 68 results
Revit Architecture - 47 results
Revit MEP - 24 results
Inventor - 21 results
AutoCAD Civil 3D - 21 results
Navisworks - 13 results
Revit Structure - 9 results
Vault - 6 results
AutoCAD Plant 3D - 4 results
3ds Max - 2 results
Total = 215 results
The following people achieved a maximum score of 100% in their track:
Tracy Chadwick - AutoCAD
Glen Sullivan - AutoCAD
Christine Noble - AutoCAD
Ken Fields - Revit MEP
(Plus Erik Eriksson scored a tantalisingly close 99% in the Revit Architecture track!)
The following people demonstrated impressive versatility by achieving above average scores in multiple tracks:
Chris Brown - AutoCAD, Inventor and Navisworks
Tracy Chadwick - AutoCAD, Revit Architecture, Inventor and Vault
Seth Cohen - AutoCAD and Civil 3D
Jason Dupree - AutoCAD and Inventor
Ken Fields - AutoCAD and Revit MEP
John Fout - Navisworks, Revit Architecture and Revit Structure
Rebecca Frangipane - Navisworks, Revit Architecture and Revit Structure
Brent McAnney - AutoCAD, Civil 3D and Vault
Kate Morrical - AutoCAD and Revit Structure
Mohaimie Mosmin - AutoCAD and Inventor
David Rushforth - Navisworks, Revit Architecture and Revit MEP
Maxime Sanschagrin - Revit Architecture, Revit MEP and Revit Structure
Jim Swain - AutoCAD and Inventor
Nick Tanner - AutoCAD, Navisworks and Revit Architecture
Boys vs Girls
Finally, let's take a look at the demographic breakdown of the competition. Out of 215 contestants, 185 were male and 30 female. The average overall performance for each group breaks down like this:
Girls: 64% in 9 mins 25 secs
Boys: 58% in 9 mins 11 secs
So, that's Top DAUG finished for another year. A thoroughly enjoyable 3 days at AU2014. 215 completed tests, across 10 popular tracks. Once again, the overall standard was extremely high, with some outstanding individual performances from our track winners and top 10 contestants.
Congratulations to all our winners. Thanks to the AUGI team for their support. Lastly, a big thank you to everyone who took part in this year's contest.
See you all in 2015 at the Venetian for more fun & games!