Friday, 21 April 2017

How To Explain Skills Assessments To Your Teams

A common question asked by KS customers is how best to present the topic of skills testing to their teams.

Here is an example of a recent customer email:

"Do you have experience with encouraging employees to take the assessments? We want to encourage staff without being off-putting or making it seem like the point of the exercise is to determine their tenure within the organization.

What do you think is the best approach to engage our staff? Do you give it a different name besides assessment?

Your assistance is greatly appreciated."



Here is our reply...

The key to your communication is to emphasize the 'What's in it for me?' aspect of the assessments. No-one likes taking tests; as human beings, we'll avoid them wherever possible! But, conversely, people are curious about how well they are doing, and they want to improve. They also don't know what they don't know, or how they measure up to their peers and the wider industry. So you need to focus your message on the training needs side of the learning path. You can't take people on a focused training journey if you don't know their starting point. And you can't assume that everyone knows the basics and therefore deserves advanced training; they need to demonstrate it.

When you have a clear, firm benchmark, you can also hire more confidently and intelligently, which is good news for incoming staff and existing teams alike.

So don't make the assessments optional. This needs to be managed and mandatory, but with a light touch. And the results need to be analysed and fed into a relevant training plan.

Also, people have a natural fear of failure. So make it clear that nothing bad will happen if they screw up. They don't need to feel apprehensive or nervous. This is about positive improvement, not picking on people who happen to have gaps in their knowledge.

And make it clear that the results are confidential, accessed only by those with responsibility for leadership and staff development.

Here is a useful User FAQ guide, which will help you shape your communications in this space:


WHAT IS KNOWLEDGESMART?
KnowledgeSmart is a web-based skills assessment tool which enables architectural and engineering practices to measure the software skills of their teams. Since 2003, more than 30,000 individual assessments have been carried out globally, establishing a helpful independent benchmark for basic proficiency across a range of subjects.

Here is a microsite which provides helpful industry benchmark data for our Top 5 skills assessment topics: http://www.2dim4bim.com.

The KnowledgeSmart tools help to identify skills associated with design and engineering software applications, such as AutoCAD, Civil 3D, Plant 3D, Revit, Inventor, MicroStation, ProjectWise, 3ds Max, ArchiCAD, Primavera P6, Adobe Photoshop, Adobe InDesign, SketchUp and Rhino. The process aims to assess current performance in order to identify areas where individuals can benefit from additional training.


WHAT IS THE POINT OF KNOWLEDGESMART AND WHY SHOULD I DO IT?
This process is not a Big Brother-style test to check up on unsuspecting users, nor is it about pointing out who is the most knowledgeable software user. Rather, the point of the exercise is to highlight how much each individual currently knows – in order to identify those areas where focused training can improve basic knowledge.

Everyone uses software in a different way. Some people use these tools full time in their role; others only use them from time to time. The objective is not to teach a veteran user how to perform basic tasks. But the fact remains that a more detailed knowledge of basic commands and operations can reduce apprehension when using technical software applications and help speed up routine tasks.

IS IT THE SAME ASSESSMENT FOR EVERYONE?
There are a number of assessment modules to choose from, created by a team of expert authors. Firms can also create their own material in-house. For Revit skills, as an example, there is a choice of material ranging from introductory level all the way up to more advanced concepts. Summary reports take into consideration whether an individual is a full-time user or someone who only uses the software occasionally as part of a wider role, and follow-up comparisons and training recommendations are weighted accordingly.

WHAT KIND OF TOPICS DOES EACH ASSESSMENT COVER?
KnowledgeSmart assessments typically break down complex technical software applications into a series of stages or modules.
For example, the fundamentals assessments (for AutoCAD & MicroStation) cover a range of basic commands, including basic object/element creation, layers/levels, blocks/cells, annotation, referencing & printing, UCS/ACS and preferences.

Assessments typically comprise 20-30 questions and present scores as a percentage, based on the accuracy of the answers given. There are 2 question types and 9 question styles. Question types are either task-based (i.e. you need to perform a task using the software in order to work out the answer) or knowledge-based (i.e. you are being evaluated on your understanding of certain software functions, without having to actually use the tools). Question styles include: Free text, Multiple choice, Pick list, Order list, True or false, Complete the blank, Matching list, Matrix and Essay.
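To make the percentage scoring concrete, here is a minimal sketch of how an overall score can be derived from marks awarded per question. This assumes marks are awarded per question with partial credit possible; the actual KnowledgeSmart weighting scheme may differ.

```python
# Sketch of percentage scoring: overall result as a percentage of
# the marks on offer. Assumes per-question marks with partial credit;
# the real KS weighting may differ.

def percentage_score(marks_awarded, marks_available):
    """Return the overall score as a percentage, rounded to 1 decimal place."""
    if sum(marks_available) == 0:
        raise ValueError("assessment has no marks available")
    return round(100 * sum(marks_awarded) / sum(marks_available), 1)

# e.g. a 25-question assessment, 1 mark each, 19 answered correctly
print(percentage_score([1] * 19 + [0] * 6, [1] * 25))  # 76.0
```

The same calculation works whether a question is worth one mark or several, which is why dropped marks (rather than wrong answers) are what the feedback reports focus on.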

WHAT IF I DO NOT SCORE VERY WELL?
This exercise is not about who is the best. You will identify areas where you can improve your own performance with the help of more targeted training workshops.
Following the assessment, everyone receives an overall result and a summary report with feedback on dropped marks, training recommendations and coaching notes.

WHEN AND WHERE WILL THE ASSESSMENT TAKE PLACE?
Your administrator will have details of your assessment program. If you are unable to make the arranged date or time, please contact your administrator to reschedule.

HOW LONG DOES IT TAKE?
There is no time limit for the exercise. The average time is approximately an hour, but this is not a strict rule.
Try to work at your usual speed. Don’t rush, as many simple mistakes are made by people not reading the instructions carefully. Conversely, don’t take too long over each task, as your score and the time taken will both be considered in the overall results analysis.

WHEN DO I GET MY RESULT?
You will receive your score straight away when you finish the assessment. You may also be emailed with a link to your assessment report, including feedback on any marks which have been dropped.

ADDITIONAL INFORMATION
The amount of time you take will not affect the accuracy of your score.
Before you start, you may be presented with a zip folder containing files which you’ll need to refer to during your assessment. Save the zip folder locally on your machine and extract the sample data sets. And remember where you save them, as you will need to navigate back to this folder during your assessment!
Please read the written instructions carefully for each question. This is also an assessment of your ability to follow instructions. Often, marks are dropped through candidates simply not doing what they have been asked to do.
During your assessment session, you can answer the questions in any order, using the KnowledgeSmart question navigator. You can also edit your answers during the session and review your work once more at the end, before you select Finish.
The time taken to complete an assessment varies depending on the individual, but an average is about an hour. You don’t have to complete the assessment in one sitting; you can log out and return later if you prefer. When you resume, your previous answers will have been saved.


R

Friday, 31 March 2017

New Test Title - Oracle Primavera P6

KnowledgeSmart's extensive skills assessment library helps AEC businesses capture evidence of individual and corporate skills capability, across a wide variety of essential technical software titles.
This includes popular software products from major vendors, including Autodesk, Bentley Systems, Graphisoft and Adobe. And now, for the first time, we have added Oracle to our stable of vendors.



One of the most popular requests in recent months has been to create assessment content for Project Management software, starting with Oracle's Primavera P6.

This assessment looks at knowledge of using Primavera P6, across a range of core functions.

Modules covered include:

- Starting Up and Navigation
- Creating a New Project
- Calendars
- WBS Nodes
- Creating Activities
- Formatting
- Relationships
- Constraints
- Layouts
- Filters
- Scheduling/Setting a Baseline
- Updating an Unresourced Schedule
- User Preferences
- Creating Resources
- Assigning Resources
- Managing Resources
- Organising Project Data
- Security

This set of questions has been written by our friends at Prima Skills and goes live in April.

R

Saturday, 25 February 2017

KS Wishlist update

We receive a regular flow of new feature ideas from our customers in different parts of the world. The best ideas find their way onto the KS software update wishlist and then, in turn, into the live site.

Here is what we've been up to in recent months...

Update survey Invite > History page with new options for managing invites.


Add a new method to quickly import a user list directly onto the invites page (for both tests and surveys).  Add a new option to revert to the default invite text.


You'll find a formatted csv template in the help notes box on the right hand side of the page.
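As a purely illustrative sketch (the column headings here are an assumption – the template in the help notes box is the authoritative format), a user-import csv might look something like this:

```
first_name,last_name,email,group
Jane,Doe,jane.doe@example.com,London Office
Sam,Smith,sam.smith@example.com,New York Office
```

Save the file in csv format and upload it on the invites page to populate the invite list in one step.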


Add more options for managing survey results data.





Tidy up logic on invite re-send tool, where expiry dates have been amended.

Tidy up csv export for dynamic groups in results page.

Tidy up logic to prevent creation of blank groups in users and results pages.

Change presentation of results graphs. Split into two sections: 'Performance' and 'Comparisons'. Add new section to display charts for exam results.


Change results csv export to include exam results, as well as assessment results.

Include option to retrospectively apply exam grade to legacy test results. Swap gap analysis report into exam certificate on user page. Move result from My Assessments to My Exams.


Allow admins to send a link to an exam certificate and/or a gap-analysis report to users who have completed an exam.


Allow admins to choose default landing account when logging in.


Add number of results to Results > Data page.  Add user count to footer of training requirements chart.

Coming soon...

Here's a brief summary of what we're working on over the next 6-12 months...

Test UI refresh - new mobile responsive design for the KS test-taking interface
Test report refresh - new design for KS test report
New charting options for the KS Query Tool
Use of CDN for managing test files and downloads
New KS support tool
Tidy up system mails logic (for invite, reminders and results mails)
Bulk upload of questions in library > draft questions area
Option to set an expiry date on the main survey invites page
Reminder mails option for survey invites 
Add dynamic groups to survey invite > history
If a group is created in Users (static or dynamic), include option to display it in Results automatically.
Auto reports - Email summary report of KS account activity for admins
Searchable skills matrix
Reporting and charting templates


R

Thursday, 16 February 2017

Case Study - Skills Assessment at David Morley Architects

David Morley Architects has partnered with KnowledgeSmart and Evolve Consultancy to identify gaps in its team’s CAD and BIM expertise and provide the targeted training needed to raise benchmarks across the firm.

Click here for a detailed case study, recently published in AEC Magazine, about the benefits of an enlightened, forward-thinking, process-driven, tech-savvy, BIM-capable business.

This case study looks in detail at skills assessment, benchmarking, technology, data, metrics, relevant consultancy and focused training.  An excellent example of good practice in the AEC industry.

R

Tuesday, 7 February 2017

New Test Title - Solibri Fundamentals

KnowledgeSmart's extensive skills assessment library helps AEC businesses capture evidence of individual and corporate skills capability, across a wide variety of essential technical software titles.

This includes popular software products from major vendors, including Autodesk, Bentley Systems, Graphisoft and Adobe.

One of the most popular requests in recent months has been to create a question set for Solibri Model Checker.

This assessment looks at knowledge of using Solibri in a BIM workflow for model validation, review and interference resolution. The assessment requires Solibri v9.7.

Modules covered include:

- Solibri-User Interface
- Solibri-Model Review
- Solibri-Data Interrogation
- Solibri-Model Checking
- Solibri-Model Comparison

This set of questions has been written by our friends at Evolve Consultancy.

R

Thursday, 12 January 2017

20 Key BIM Terms

Sometimes AEC businesses struggle with the plethora of BIM conferences, seminars, white papers, 'expert' consultants, Government advice, conflicting terminology and official guidance.

BIM is gaining momentum and maturity across the Global Construction industry.

Written a couple of years back, but still very relevant today, this informative, thoughtful article from NBS, entitled '20 key BIM terms you need to know', remains well worth a read.

https://www.thenbs.com/knowledge/the-20-key-bim-terms-you-need-to-know.

R

Saturday, 10 December 2016

'BIG 5' Benchmark Stats

If your firm increases productivity by 1% per person annually, you gain 2 extra working days per team member.
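The arithmetic behind that claim can be sketched as follows, assuming roughly 230 working days per person per year (a figure net of holidays; the exact number varies by firm and country, and is our assumption rather than a KS statistic):

```python
# Back-of-the-envelope check of the "1% productivity = ~2 extra days" claim.
# WORKING_DAYS_PER_YEAR is an assumed figure; adjust for your own firm.
WORKING_DAYS_PER_YEAR = 230

def days_gained(productivity_gain, working_days=WORKING_DAYS_PER_YEAR):
    """Extra working days of output gained per person per year."""
    return productivity_gain * working_days

print(round(days_gained(0.01), 1))  # 2.3
```

A 1% gain on a ~230-day working year comes out at a little over two days per person, which is where the headline figure comes from.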

How do you measure your firm's productivity against your peers and the wider AEC industry?

KnowledgeSmart has been capturing test scores across a wide range of technical software products, used by Architects & Engineers around the world.

We have data-mined thousands of results for the 'BIG 5' Autodesk software titles, as follows:

Revit Architecture
Revit MEP
Revit Structure
AutoCAD
AutoCAD Civil 3D

The results offer a fascinating insight into the way these tools are deployed and utilised in different territories around the world. This is the most detailed study of software proficiency yet carried out with live test data, spanning thousands of users and hundreds of AEC businesses.

A detailed PDF report can be downloaded for each topic from this microsite.

Now, for the first time, management teams can compare their in-house performance and technical software capability against the wider Global industry benchmark.

This data can be included in the hiring process, new team selection, annual appraisals and when bidding for new project work.

R