Friday, 16 March 2012

KS Help Notes_Admin 02_Planning your Assessments


ADMINISTRATOR NOTES FOR PLANNING AN ASSESSMENT PROGRAM


COMMUNICATING YOUR PLANS

It is important to explain to candidates precisely why you are asking them to undertake a skills assessment. You might do this by including the assessment in a wider discussion about more efficient use of technology within your company, covering training needs, recruitment, induction and annual appraisal processes.

Improving technical software skills is a continuous process – the KnowledgeSmart tools simply highlight a likely starting point. As with most things, good communication goes a long way. An effective strategy is to hold a team meeting and explain your overall plans for improving performance within the business. This can include a Q&A session, where people have an opportunity to clarify any aspect of the assessment process – and understand how it fits into the bigger learning picture.

It is important that people don’t feel that the results of their skills evaluation may be used in a negative context, particularly if they don’t perform well first time around. Ensure they understand this is designed to be a learning process – and everyone has to start somewhere! By taking a snapshot of each individual’s current skills at a basic level, you can feed this data into your training plans moving forwards.

This is not all about identifying the smartest, fastest or best Revit user, or the most experienced MicroStation guru – although these are useful things to know. The real value of the KS tools lies in helping users, at an individual level, identify what they know and what they need to brush up on next, in order to improve their skills with a particular piece of software.

Since 2003, KS has facilitated over 25,000 individual test sessions, across hundreds of AEC firms.  Experience has shown that, in general, people don't like the idea of taking a test - at least that's usually how they feel at first.  However, experience has also shown that, for most people, when they complete their assessment, they realise that it wasn't half as bad as they thought it would be! Further, they often learn new skills as they go through the process of answering the questions.  And lastly, they find the feedback and coaching notes extremely helpful, in giving them an unbiased appraisal of their skills using a particular software application - relative to their peers in industry.  It's important that users have the opportunity to review their test scores and feedback with their technology leaders or learning & development leaders.  And it's also important to explain to your users that all scores are confidential.  Best practice would recommend that only senior administrators should have access to firm-wide results data.

Provided you explain to your teams that the main point of rolling out an assessment program is to gather valuable training needs analysis data, rather than to conduct a big-brother-style exercise in peeking over shoulders, most users feel OK about the idea. And when you follow through with more focused, modular training workshops that target recognised skills gaps, it's a win-win for users and management alike.

Evidence shows that better overall understanding of basic commands can also benefit the implementation of in-house standards. Once you have presented the case for measuring and improving skills to your teams, you need to set a timeframe for rolling out your assessments.

PREPARE A TIMESCALE

You need to decide over what period of time you want to complete your assessments. This will depend largely on three variables: 1) the number of people to be assessed, 2) the number of offices in your organisation and 3) general day-to-day work pressures.

You also need to decide what your preferred assessment environment will be. Some companies prefer their teams to use their own workstations, whilst others favour a more formal setting. In truth, there is no right or wrong answer here; you want people to feel comfortable, and to remove any potential for apprehension or pre-assessment nerves.

A formal assessment environment can be helpful, where people simply take an hour or so away from their desks, at a specified time, to sit their assessment. This way, they aren’t distracted by project work, email, ringing phones or interruptions from colleagues.

For multi-office firms, it is best to start with one location and, when this office is done, gradually ramp up the level of activity until the assessment program is complete. You will need the support of your regional technology co-ordinators if this is to be a success.

PREPARE YOUR TEAMS

We recommend you brief your teams prior to their assessments. You might like to share with them the KS FAQ document and/or KS user video, or perhaps present your plans at a team meeting. It is best to circulate the information at least a day or two before people are due to take their assessments.

SEND YOUR ASSESSMENT INVITES

Use the invite tools in the KS admin dashboard to manage your assessment program. You can send invites individually, or create a longer list of users and invite them in groups.

Next, you may customise your welcome message in the invite mail and on the assessment start page.  You might wish to present users with specific instructions on where to locate their sample files (if you have chosen to store them on your own network, for example).

COMPLETE YOUR ASSESSMENTS

Monitor your teams as they complete their assessments. Administrators receive an email with a link to test results, as and when they are completed.

You can use the invite history page of your admin dashboard to track invites and completed assessments.  You can also re-send invites from here.

AFTER THE ASSESSMENTS

KS administrators can decide how much feedback to present to their users.  At the end of their assessment, candidates may receive an instant result and full summary report with a breakdown of their score, training keywords and coaching notes. In addition, they may be sent an email containing a link to their report.

Account administrators can access all assessment results, feedback, training data and coaching notes – for each candidate – in the KS admin dashboard.

Via the admin dashboard, it is possible to search results, view performance charts and export data into Excel for in-house reports. It is important to provide the opportunity for individuals to access a structured training program, after their skills have been evaluated. The results will highlight areas where future training workshops can be focused for maximum improvement.
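
If you want to go beyond the built-in charts, the exported data lends itself to simple scripting. Here is a minimal sketch in Python, assuming a hypothetical CSV export named 'ks_results_export.csv' with 'group' and 'score' columns (column names will vary; check your own export):

    import csv
    from collections import defaultdict

    # Summarise a hypothetical KS results export by group.
    totals = defaultdict(lambda: [0.0, 0])  # group -> [score total, test count]

    with open("ks_results_export.csv", newline="") as f:
        for row in csv.DictReader(f):
            group = row["group"]         # assumed column name
            score = float(row["score"])  # assumed 0-100 percentage score
            totals[group][0] += score
            totals[group][1] += 1

    for group, (total, count) in sorted(totals.items()):
        print(f"{group}: average {total / count:.1f}% across {count} tests")

A summary like this makes it easy to line up skills data against the training workshops you plan next.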

It can be helpful to schedule a follow-up meeting with KnowledgeSmart, to analyze your results data. If you work with an external training partner, it is a good idea to ask for their recommendations on post-assessment training activity.

R

KS Help Notes_Admin 01_Checklist


KS ADMIN CHECKLIST


Here is a checklist for KS admins to use when planning their skills assessment program.

  • Sign KS subscription agreement/renewal
  • Identify Technology, HR, L&D and Management sponsors
  • Agree on your goals; how will you measure your success?
  • Identify KS admins; who will coordinate your assessments?
  • Decide on account hierarchy; one account or several?
  • Schedule KS admin web trainings
  • Decide on your library content; ‘off the shelf’ or customize?
  • Agree on your message to users; clear communication is important
  • Feedback and coaching options
  • Include assessments for new hires at interview?
  • Send out your invites
  • Monitor your assessment activity
  • One-month review
  • Post-assessment review; analyzing your results, charting and reporting
  • Follow up activity and training
  • Post-training assessments; measure improvements 
  • New features?
  • New assessment titles?

Sponsors

The first task is to identify the main contacts across your organisation who will help support your skills assessment activity. Experience shows that it is helpful to gain support from the following key business areas:

Technology & BIM
HR and Learning & Development
Management

Goals – Measuring your Success

Think about your goals. What do you want to achieve from your assessment program? Is it better visibility of training needs? Is it a benchmark for minimum performance? Is it a team-by-team breakdown? Do you want to compare different offices? Or different countries? What are the most important assessment topics for your organisation? Do you want to make the assessments optional or mandatory? (Historical evidence suggests that it will help your completion rates if there is a clear message from management that this is a mandatory exercise.)

How will you measure your success? What training resources do you have access to (in-house or 3rd party)? Do you have a formal training partner? Do you have access to E-Learning material? Do you have an LMS or HR database? Do you have a company Intranet?

Your Account Structure

Think about your KS account hierarchy and any additional administrators who will be responsible for coordinating assessments for their teams or offices.

For example, you can administer your assessments from a single account, or you can set up multiple accounts, representing different offices or business groups, within your organisation.


You can decide which admins will have ‘global’ access to your assessment data and which will have more restricted admin permissions.

Admin Training & Support

Schedule some web trainings for your KS admins. There is a range of help guides available for using the KS tools, which will help you and your team get up to speed quickly.

It is helpful to schedule a getting started webinar, to walk through the system, answer any questions and share some ideas on good practice for successfully managing your assessment program.  The KS team will be available to help you, as you get up and running.

Library Content

Think about your assessment library. Do you want to include any custom content, or simply use the KS library unchanged? Most firms tend to use the material ‘off the shelf’, at least to begin with. Over time, as you become more familiar with the library questions, you’ll likely make some changes and add your own material.

Communicating Your Plans

Think about how you wish to communicate your plans to your teams. Good communication is essential in explaining to your users what you want to achieve. Review the KS guides on managing your assessment program, together with the user Q&A information and video.

How much feedback do you want your users to see at the end of their assessment? Do you want to include coaching notes or links to learning?  Do you want to assess new hires at interview?

Review Your Progress

One-month review: schedule an interim update call with KS, to make sure everything is on track. It’s still early days, so it sometimes helps to share any technical issues or queries you may have at this stage.

Sending Your Invites

Send out your invites and commence your assessment program. Monitor progress and chase up any missed appointments. The KS team will be on hand to assist with any support issues you encounter.

Analyzing Your Results

Post-assessment review: schedule an update call with KS, to discuss progress and analyze your results in detail. Look at options for charting and reporting your data. Discuss skills gaps and training needs. Will you involve a training partner at this stage?

Training

Feed the skills gaps data from your assessments into a continuous improvement program of training, lunch & learns, seminars, E-Learning and mentoring.

Post-Training

Post-training assessment: use the KS tools to measure the effectiveness of your training and how much information people have retained.

Feedback

New features & feedback: please share your thoughts with us on what you would like us to develop or improve in the KS dashboard.

New library material; do you want us to add any new titles to the KS library?

R

Wednesday, 29 February 2012

KS System - next release

We're just going through the final round of testing on our staging site, before we move the next release of the KS tools to the live site. The beauty of a web-based system is that all KS customers get to enjoy the latest features as soon as they go live, without worrying about physically uninstalling or reinstalling software.

A number of firms have asked for a heads-up on what will be appearing in the March release. We'll be talking about this in detail at the upcoming KS user group in London, but for now, here's a brief overview of the new features, plus a look ahead to the point release currently in development, which rolls out around May.

The main theme of the next release is to introduce account hierarchies, for firms with two or more locations in their organisation. Large firms with multiple KS accounts in place will now be able to view and manage data across linked accounts. This impacts almost every area of the KS admin dashboard.

KS Library

A common request is to enable firms to share custom test material created in one KS account with the related accounts in an organisation. The 'Test Content' pages have been re-branded as the 'Library' area of the dashboard, with a new menu structure now appearing. The 'Your Tests / Modules / Questions' area now appears as 'Draft Tests / Modules / Questions'. When a KS OTS (off the shelf) test is imported, the editable copy now appears in 'Draft Tests', the individual editable stages appear in 'Draft Modules', and the editable questions appear in 'Draft Questions'.

When you are happy with your new content (either material you have written in-house, or an OTS test that has been modified), you now have the option to 'Publish' your modules and tests. This effectively locks down that content and allows you to share it across linked accounts. Draft content cannot be shared between accounts. Published modules and tests can be un-published at any time, for further changes.
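
To make the lifecycle concrete, here is a small illustrative model in Python – a sketch of the rules described above, not actual KS code:

    class Content:
        # Illustrative model of the KS draft/publish rules (a sketch, not KS code).

        def __init__(self, name):
            self.name = name
            self.state = "draft"  # imported or newly written content starts as a draft

        def publish(self):
            # Publishing locks the content down and makes it shareable.
            self.state = "published"

        def unpublish(self):
            # Un-publishing re-opens the content for further changes.
            self.state = "draft"

        def shareable_across_linked_accounts(self):
            # Only published content may be shared; drafts stay private to their account.
            return self.state == "published"

In other words, content moves freely between the two states; the one hard rule is that sharing requires the published state.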

Viewing Linked Accounts

Another key change to the KS dashboard is the introduction of a hierarchy of linked accounts. Admins can now view their account structure of 'parent', 'child' and 'sibling' accounts. For admins with permission to access multiple accounts, a new 'Switch account' feature allows simple navigation between accounts. For viewing data across linked accounts, a new sub-menu dropdown is available in the Library, Invites > History, Results > Data and Users pages. This sub-menu can be used to drill down to individual accounts, or to view information from all linked accounts, organisation-wide.
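
One way to picture the hierarchy is as a simple tree of accounts. The Python sketch below (invented office names, not KS internals) shows how an organisation-wide view might aggregate data by walking the tree from a parent account downwards:

    class Account:
        # Illustrative parent/child account tree (invented example, not KS internals).

        def __init__(self, name):
            self.name = name
            self.parent = None
            self.children = []
            self.results = []  # test results held directly by this account

        def add_child(self, child):
            child.parent = self
            self.children.append(child)
            return child

        def all_results(self):
            # This account's results plus those of every account beneath it,
            # i.e. the organisation-wide view offered by the new dropdown.
            combined = list(self.results)
            for child in self.children:
                combined.extend(child.all_results())
            return combined

    # A hypothetical three-office organisation:
    hq = Account("London HQ")
    ny = hq.add_child(Account("New York"))
    sydney = hq.add_child(Account("Sydney"))
    # hq.all_results() -> whole organisation; ny.all_results() -> New York only.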

Creating New Accounts and Administrators

It is now possible to add new KS accounts to an organisation hierarchy, and also to create new account administrators for each account.  In addition, we have created a new type of admin profile, a Sub-administrator.  Full administrators can assign their preferred status to new admins and also change the status of existing admins.  Sub-admins can only set up KS test sessions from a browser; they cannot access the full admin dashboard. This is particularly useful for delegating the task of setting up test sessions for interview candidates, without the need for a full KS administrator to be available every time.  It also opens up the option for firms to create a general admin profile, which they can make centrally available, for users to 'self-invite' themselves to take a KS test or re-test, via a browser (i.e. skip the need to issue an email invite).

Full admins can also assign account permissions (i.e. decide how high up the organisational hierarchy new admins are allowed to access company data) and flag additional admins to receive system emails (i.e. invite mails, results mails, etc.).
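
Continuing the sketch above, the two admin levels and the account-permission setting might be modelled like this (again purely illustrative, reusing the hypothetical Account tree):

    class Admin:
        # Illustrative model of Full vs Sub admin permissions (not KS internals).

        def __init__(self, name, role, top_account):
            self.name = name
            self.role = role                # "full" or "sub"
            self.top_account = top_account  # highest account this admin may reach

        def can_view_dashboard(self):
            # Sub-admins can only set up test sessions; no full dashboard access.
            return self.role == "full"

        def can_access(self, account):
            # An admin may access any account at or below their assigned
            # top-level account in the organisation hierarchy.
            node = account
            while node is not None:
                if node is self.top_account:
                    return True
                node = node.parent
            return False

    # e.g. Admin("NY coordinator", "full", ny).can_access(sydney) -> False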

Managing Content

We've added a selection of new tools that allow Full admins to share KS data across linked accounts: for example, moving user records between accounts, moving test results, sharing modules and tests between accounts, and applying changes to account settings across linked accounts.

We have also tweaked the logic relating to usernames and passwords appearing on test invite mails. Previously, this data was only appended to the first instance of an invite; subsequent invites only included the test name and URL. Now the username and password are included in every instance of an invite, which will reduce the need for users to reset their passwords (i.e. if they have lost or deleted the original mail).
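
Expressed as a rough sketch (invented field names, not the actual KS mail template), the rule change looks like this:

    def build_invite_mail(user, test, is_first_invite, new_rules=True):
        # Sketch of the invite-mail rule change (invented names, not KS code).
        body = f"Test: {test.name}\nLink: {test.url}\n"
        # Old rule: credentials appended only to the first invite.
        # New rule: credentials appended to every invite, so a lost or deleted
        # original mail no longer forces a password reset.
        if new_rules or is_first_invite:
            body += f"Username: {user.username}\nPassword: {user.password}\n"
        return body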

So, as you can see, there are quite a few changes to the overall structure of the KS system, all offering greater flexibility and control for customers to manage data across their organisation.


Here's a sneak peek at what's currently being worked on, from the KS user wish list...

General update of the KS system to a cloud-based server environment, including an updated operating system, more memory and greater flexibility for future growth.

Automatic retrospective grouping of new user and results data, based on existing groups.

New charting options and improvements to existing results data. New exporting options for results data. Linking of question footprint information to the Question Performance chart.

New options for admins to select level of precision on free text answers.

Update to the rich text editing tools, for creating and editing questions.

General improvements to the 'Invites' area of the dashboard, including auto emails for expired tests, reminder mails for users and capturing data for Issued/In Progress/Completed tests.

Updated 'Resources' area for admins, on how to utilise the KS tools and manage their assessment program.


This little lot will be finding its way onto the live site in the next 3 months. We'll be discussing all this and more at the KS user group in London, on April 26. We will also be mapping out the technical roadmap (sometimes referred to as the 'Jane List') for the second half of 2012 and beyond. Themes include: capturing and displaying additional benchmark statistics, personal dashboard pages for users, survey questions, links to training and e-learning providers, and links to HR systems and learning management systems.

R

Tuesday, 28 February 2012

KS User Group - April 26, London


We’re hosting the next KS user group meeting, in London, in April.  Here are the details:

Date:
Thursday 26 April

Location:
Smeaton Room, Buro Happold, 17 Newman Street, London

Start:
Tea & coffee at 08.30, for a 09.00 start

End:
Last session wraps up at 13.30

Cost:
Free (breakfast & lunch included)


We will be reviewing the KS tools in some detail, including a look at what’s coming up in the next 3-6 months.  Plus a guest presenter or two.  Full agenda to follow, as the date gets closer.

Places will be limited to 25, on a first come, first served basis.  As of close of business Tuesday 28 Feb, we have just 6 slots remaining.

Hope you can make it!

R

Monday, 6 February 2012

2012 Priorities for AEC Firms

I read an interesting white paper from Deltek last week, looking at where AEC firms are focusing their energy this year. Conducted in November and December 2011, Deltek’s survey polled 430 decision makers at primarily US-based AEC firms.

Not surprisingly, Marketing & Business Development and Cost Control are the two key themes for AEC firms in 2012.  70% of respondents said that increasing business development is their biggest priority this year.

This is something we are seeing across all regions. Most AEC firms have now begun to implement a BIM strategy, but the majority are still in the early stages of communicating a joined-up message. One of the areas the Deltek paper addresses is how architects and engineers can make the most of business intelligence, in order to win more work.

A trend we are noticing is the need for AEC firms to demonstrate their BIM credentials. This is important for a number of reasons. The more data firms can gather on how proficiently they can deliver a project, the more confidence they can instil in clients, prospective clients and supply chain partners. It's one thing to say, 'Oh yes, we can deliver this project to BIM Level 2'. It's quite another to actually be in a position to back up the words with hard evidence. This includes metrics from past or current BIM projects, and performance data on the individuals in the team who will be hands-on delivering the project information and working alongside the client and the GC.

And when this information is identified, it is vitally important that the business development teams are fully briefed on what it all means. More and more firms are being asked to complete BIM-readiness questionnaires these days and, as clients become more knowledgeable about BIM, those forms will request ever-increasing levels of detail. The phrase 'BIM Execution Plan' is already becoming a familiar refrain, and it is not going to go away.

No longer can AEC firms simply trade on their past reputations. They need to come up with a new message, one which truly differentiates them from their numerous competitors. We came into this recession, at the end of 2008, on the back of a near 10-year boom. We should now be bracing ourselves for a 7-10 year down-cycle, which is currently only 3 years old. It is those firms that can capture relevant, objective, BIM-related performance data who will be the real winners in the continued lean years which lie ahead.

Here is a link to the Deltek paper:
http://www.deltek.com/pdf/whitepapers/AE-Firms-2012-Priorities-WP.pdf

R

Monday, 9 January 2012

Demonstrating Professional Credentials

A common theme, here at KnowledgeSmart, is that of skilled professionals being able to demonstrate their capability and credentials, whatever the subject.  And it is a topic which is being discussed at the highest levels, here in the UK, as 2012 rolls around.

Back in 2009, I posed the (rather tongue-in-cheek) question; what do bus drivers and welders have in common that CAD & BIM professionals don't?  The answer is simple; bus drivers and welders need to demonstrate their professional credentials on a regular basis throughout their careers, in order to be allowed to do their job.

Bus drivers (and lorry drivers) have to pass a 'certificate of professional competence' in order to drive their vehicles in the UK.  The CPC was developed as a requirement of a European Directive, designed to improve the knowledge and skills of professional LGV and PCV drivers throughout their working life. There are two parts to the legislation:

- An Initial Qualification that must be achieved by new (LGV and PCV) drivers along with their vocational licence to enable them to use their licence professionally.
- Periodic Training, which involves all professional drivers undertaking 35 hours of training every 5 years.

Meanwhile, welder certifications are specially designed tests to determine a welder's skill and ability to deposit sound weld metal. The tests cover many variables, including the specific welding process, type of metal, thickness, joint design, position and others. Most certifications expire after a certain time limit and have different requirements for renewal or extension. Once a welder passes a test (or a series of tests), their employer will certify their ability, and the limitations or extent of what they are qualified to weld, in a written document (a welder qualification test record, or WQTR).

So if the driving industry has a recognised benchmark for performance and if the welding industry won't allow its members to pick up a blow-torch without first demonstrating they have the requisite skills, then why on earth does the AEC industry let just about anyone gain access to a CAD or BIM software license?

I remember reading a story on ENR.com, a while back, highlighting the dangers of unqualified staff working on projects. A warehouse in Philadelphia suffered a major structural collapse, resulting in damages of $3.5M USD. A building-collapse expert concluded, 'The team committed several engineering errors of "amazing proportions" that caused the Philadelphia warehouse to fail under the weight of snow'. He found that the warehouse had only one-third of the steel roof framing it needed.

Someone using analysis software grossly underestimated the quantity of materials, the error went unnoticed, the building was constructed, the snow fell... and the firm got hit with $3.5M in costs. Plus, a number of people were hurt when the roof fell in. Ouch! Suddenly, a few extra dollars spent testing a person's proficiency with engineering software, plus a few hours of training, seems like a good use of company resources!

A CIO at a large US design firm once said to me: “The minute you give someone a BIM license on their desktop – you are effectively giving them full access to your end product. You want to be sure they know what they are doing!” This firm doesn't allocate a license to someone's desktop unless they have first met the minimum threshold competence level for performance, as identified by the firm's BIM administrators (using KnowledgeSmart, in this instance).

A bus company doesn't give someone the keys to its shiniest double-decker without making absolutely sure they have the current skills required to drive the vehicle. Isn't it time that the AEC industry recognised the need for clear skills certification in core authoring, BIM, Civil and analysis software packages? And a continuous improvement program for design and engineering professionals to demonstrate their credentials to their employers on an ongoing basis?

Well, hopefully, that is precisely what is happening, as the UK Government mandates use of BIM Level 2 on all appropriate construction projects by 2016. Despite (or perhaps because of) the ongoing economic hardship around the world, many firms are revisiting their technology assessment and training programmes, to make sure that they can measure the skills of key personnel. Firms who can demonstrate to clients that they have the requisite skills to deliver a BIM project safely, on time and on budget, will have a definite competitive advantage as the clock counts down to 2016.

R

Thursday, 8 December 2011

AUGI Top DAUG 2011 - The Results



The AUGI Top DAUG competition was introduced at Autodesk University, way back in 2000. The contest originally consisted of two parts, based on AutoCAD 2D knowledge. This year, AUGI teamed up with KnowledgeSmart to expand the range of topics to seven tracks: 3ds Max, AutoCAD 2D, AutoCAD Civil 3D, Inventor, Revit Architecture, Revit MEP and Revit Structure.

Here is an overview of the contest from AUGI Hot News:
http://augi.typepad.com/files/augi-hotnews-daily---wednesday-nov-30-2011.pdf

There were some great prizes up for grabs, too. Track winners walked away with a $100 USD Amazon voucher (a combination of high scores and fastest times determined the winner of each track) and the overall winner also won an HP notebook PC and a free pass to AU 2012!

We spent several hours on the Monday setting up 18 networked PCs in the exhibition hall, at the AUGI stand. All 126 copies of the Autodesk software needed checking, then all the KS sample data sets had to be uploaded to each machine and checked again. Big thanks to Daniel Heselwood, from Evolve, for answering our call for help. (We actually called Daniel to ask for his help with a missing xref – and 4 hours later he was still helping us finish up!)

Here's a snapshot of the stand, when we were all set up:


The competition ran over 3 days (2 x 3-hour slots on each of the first 2 days, with a final one-hour session on the last day). Contestants had to answer a range of questions, using the 2012 version of each software title. Each session was limited to just 14 minutes, so people had to work fast! The new format proved popular, with eager users queuing up to grab a spare seat. Special thanks to AUGI team members Bob Diaz and Michael Patrick, for briefing users and making sure everything ran smoothly.

Here are the Top DAUGs in action:


By the end of the contest, we had posted 305 results across 7 tracks. On a conference venue web connection, with the added ingredients of free beer and wine, we had a completion rate of 99%, which was way ahead of our anticipated 95% target. (Our support team promptly rescued the two missing scores, so all completed tests were captured and logged on the AUGI dashboard.)

Throughout the competition, we posted a rolling list of the top 10 contestants for each track, on the AUGI big screen.



The Results

Congratulations to the following contestants, who won their respective Tracks:

3ds Max Track Winner:  James Clarke
AutoCAD 2D Track Winner:  Brent McAnney
AutoCAD Civil 3D Track Winner:  Brian Hailey
Inventor Track Winner:  Joe Bartels
Revit Architecture Track Winner:  Aaron Maller
Revit MEP Track Winner:  David Raynor
Revit Structure Track Winner:  Rebecca Frangipane

And a special mention to the overall winner of AUGI Top DAUG 2011:

Brian Hailey



Analysis

So, let's take a detailed look at the results of this year's Top DAUG competition.

Overall

No. of Tests Completed:  305
Overall Average:  53% in 12 mins 56 secs



Track 1 - 3ds Max

Track Winner: James Clarke
Winning Score: 64% in 13 mins 55 secs

Top 10 Contestants:

James Clarke
Jesse Sandifer
Daniel Heselwood
Paul Mazzoni
Fernando Oliveira
Jens Tange
Matti Oopik
Yosun Chang
Douglas Bowers
Charlie Forsythe

No. Completed: 13
Group Average: 18% in 12 mins 15 secs
Top 10 Average:  20% in 12 mins 2 secs


Track 2 - AutoCAD 2D

Track Winner: Brent McAnney
Winning Score: 100% in 11 mins 25 secs

Top 10 Contestants:

Brent McAnney
Alex Lepeska
Scott Wilcox
Heather Shrieves
Timothy Vaughan
Ben Rand
Decio Ferreira
Jim LaPier
Richard Lawrence
Youssri Salman

No. Completed: 110
Group Average: 42% in 13 mins 43 secs
Top 10 Average:  82% in 12 mins 42 secs


Track 3 - AutoCAD Civil 3D

Track Winner: Brian Hailey
Winning Score: 100% in 6 mins 10 secs

Top 10 Contestants:

Brian Hailey
Richard Lawrence
Travis Winter
Bryan Thomasy
Kirk Noonan
Jeff Nichols
Bruce Klug
Brent McAnney
Charles D'Errico
Bill Neuhauser

No. Completed: 26
Group Average: 61% in 12 mins 33 secs
Top 10 Average:  90% in 11 mins 24 secs


Track 4 - Inventor

Track Winner: Joe Bartels
Winning Score: 80% in 8 mins 20 secs

Top 10 Contestants:

Joe Bartels
Gerrard Hickson
Andrew Warren
Bill Graham
Alex Karan
Marius Minnen
Anders Tokerud
Chris Brown
Scott Wayand
Curtiss Cooke

No. Completed: 27
Group Average: 59% in 13 mins 6 secs
Top 10 Average:  75% in 12 mins 9 secs


Track 5 - Revit Architecture

Track Winner: Aaron Maller
Winning Score: 100% in 6 mins 55 secs

Top 10 Contestants:

Aaron Maller
Brian Mackey
Anthony Tiefenbach
Steve Faust
Sean Darnell
Eric Bernier
Adam Ward
Douglas Bowers
Andrew Fisher
David Ivey

No. Completed: 84
Group Average: 79% in 11 mins 35 secs
Top 10 Average:  99% in 10 mins 7 secs


Track 6 - Revit MEP

Track Winner: David Raynor
Winning Score: 80% in 13 mins 25 secs

Top 10 Contestants:

David Raynor
John Karben
Fernando Oliveira
Jason Vaia
Philip Charlson
David Rushforth
Clifford Baker
Douglas Bowers
Maxime Sanschagrin
Paul Beseman

No. Completed: 23
Group Average: 54% in 13 mins 50 secs
Top 10 Average:  67% in 13 mins 49 secs


Track 7 - Revit Structure

Track Winner: Rebecca Frangipane
Winning Score: 98% in 13 mins 45 secs

Top 10 Contestants:

Rebecca Frangipane
Andrew Lawrence
Dezi Ratley
Brian Mackey
Eric Bernier
Matthew Hill
Tina Bos
Michael Patrick
Jason Rhodes
Fernando Oliveira

No. Completed: 22
Group Average: 56% in 13 mins 32 secs
Top 10 Average:  77% in 13 mins 5 secs




Further Analysis & Observations

Popular Tracks

The most popular tracks, in order of completed tests, were as follows:

AutoCAD 2D - 110 results
Revit Architecture - 84 results
Inventor - 27 results
AutoCAD Civil 3D - 26 results
Revit MEP - 23 results
Revit Structure - 22 results
3ds Max - 13 results


Training Needs Analysis

So what does all this mean, in terms of performance and training?

For 3ds Max, the top 10 training items were, in priority order:
Creation Parameters, Cross Sections, Map Channels, Map Coordinates, Object Data Flow, Object Modifiers, Object Properties, Pivots, Rotate and Space Warps.



For AutoCAD 2D, the top 10 training items were, in priority order:
Attaching References, Modifying References, Dimensioning Objects, Measuring, System Variables, Object Snaps, Arcs, Drafting Tools, Lines and Tangent.



For AutoCAD Civil 3D, the top 10 training items were, in priority order:
Changing a Slope, Pipe Networks, Sections, Corridor Surfaces, Corridors, Breaklines, Surfaces, Survey, Alignments and Parcels.



For Inventor, the top 10 training items were, in priority order:
Parameters, Tolerances, Features, Revolve, Sketch, Hole, Project Settings, Styles, Components and Parts.



For Revit Architecture, the top 10 training items were, in priority order:
Clipboard, Data Integrity, Modeling, Family Editor, Families, File Management, Worksharing, Room Tags, Walls and Annotation.



For Revit MEP, the top 10 training items were, in priority order:
System Inspector, Apparent Load Values, Creating Power Systems, Editing Properties, Modifying Light Fixtures, Panel Circuits, Panel Schedules, Photometric Web Files, Selecting Equipment and Electrical Settings.



For Revit Structure, the top 10 training items were, in priority order:
Families, Scope Boxes, Types, Visibility, Roofs, Floors, Slabs, Structural Deck, Annotation and Span Direction Symbol.



Revit Data

It's interesting to compare the Revit results across the 3 disciplines, which arguably reflect where Revit currently sits in terms of wider industry adoption. Revit Architecture proved to be the most popular track, and the overall experience of the users taking part in Top DAUG suggests that this software is the most mature, in terms of individual expertise. Revit MEP and Revit Structure were very close in terms of overall performance, but considerably behind Revit Architecture on a like-for-like productivity comparison.

RAC overall:  79% in 11 mins 35 secs
RMEP overall:  54% in 13 mins 50 secs
RST overall:  56% in 13 mins 32 secs


It's also worth noting that the average Revit Architecture score, for the top 10 users, was an impressive 99% in 10 mins 7 secs.  In fact, 31 users out of 84 posted scores of 90% or higher, which is outstanding.

As an additional comparison, we posted 28 results for Revit Architecture at RTC USA, earlier in the year.  The overall average for Revit Architecture, at the RTC 'Top Cat' contest, was 79% in 9 mins 7 secs. (NB the questions were different at RTC, but the level of difficulty was on par with Top DAUG).


Range of scores

Interestingly, across the 7 tracks, we saw scores ranging from 0% to 100%.  Here is a summary of both ends of the performance scale:

7 x 100% scores (1 x AutoCAD 2D, 1 x AutoCAD Civil 3D, 5 x Revit Architecture).
13 x 0% scores (1 x 3ds Max, 11 x AutoCAD 2D, 1 x Revit Architecture).


Honourable mentions

Along with our track winners, the following contestants deserve a special mention, for their performance in the competition:

Brian Mackey - winner of the RTC USA Top Cat contest in the summer, Brian placed second in the Top DAUG RAC track (scoring 100%), and also placed in the top 4 for RST.  An exceptional effort!

Brent McAnney - won the AutoCAD 2D track and placed in the top 8 for AutoCAD Civil 3D.

Eric Bernier - placed inside the top 6 for both the RAC and RST tracks.

Douglas Bowers - placed inside the top 8 for both the RAC and RMEP tracks.

Fernando Oliveira - placed inside the top 10 for the 3ds Max, RMEP and RST tracks.

Richard Lawrence -  placed inside the top 10 for both the AutoCAD 2D and AutoCAD Civil 3D tracks.

Heather Shrieves - placed inside the top 5 for AutoCAD 2D and top 12 for Inventor.

Jason Vaia - placed inside the top 5 for RMEP and top 18 for RAC.


Boys vs Girls

Finally, let's take a look at the demographic breakdown of the competition. Out of 305 contestants, 271 were male and 34 were female. The average overall performance for each group breaks down like this:

Female:  61% in 12 mins 37 secs
Male:  53% in 12 mins 57 secs



So, there we have it! A thoroughly enjoyable 3 days at AU2011. 305 completed tests, across 7 popular Autodesk software applications.  It's fair to say that the overall standard was extremely high, with some truly outstanding individual performances from our track winners and top 10 contestants.  

Congratulations to all our winners.  Thanks again to the AUGI team for all their support, in helping us put together the new format for Top DAUG.  Particular thanks to AUGI President, David Harrington, past-President, Mark Kiker and past-Board member, Bob Diaz.  Lastly, a big thank you to everyone who took part in this year's contest.  See you all at the Mandalay Bay in 2012? 

R