Showing posts with label Civil 3D skills test. Show all posts

Thursday, 28 January 2016

Global Test Scores Posted by Civil Engineers

Civil Engineers from across the world have submitted several thousand KS test scores in recent years. Let’s take a more detailed look at the people behind the headline stats.

We capture some interesting user demographic data at the start of each test session. This includes the following questions:

- Primary Industry/Discipline
- Primary Role
- Country & State
- Self-rating
- How many years have you used this tool?
- Where did you learn to use this tool?
- How often do you use this tool? (Daily, weekly, occasional)
- How did you learn to use this tool? (On the job, Formal training, College)


The top 3 tests completed by Civil Engineers, in order of volume, are:

AutoCAD Civil 3D
AutoCAD
MicroStation


The results data comes from a wide range of countries and territories, primarily the US, Canada, the UK, Australia, South Africa and the Middle East.


Here are the overall results:

AutoCAD Civil 3D

Avg score/time for:

Regular users = 73.4% in 3294 secs
Part-time users = 68.2% in 4109 secs
1 to 5 years' experience (all types) = 63.4% in 3783 secs
6 to 10 years' experience (all types) = 75.5% in 3170 secs
11 to 15 years' experience (all types) = 76.3% in 3290 secs
On the job learning (all types) = 71.1% in 3463 secs
Formal learning (all types) = 67.4% in 3587 secs

The average AutoCAD Civil 3D self-rating score for Civil Engineers is: 3.9.


AutoCAD

Avg score/time for:

Regular users = 70.2% in 4867 secs
Part-time users = 63.1% in 4842 secs
1 to 5 years' experience (all types) = 55.3% in 5130 secs
6 to 10 years' experience (all types) = 70.4% in 4363 secs
11 to 15 years' experience (all types) = 69.9% in 3799 secs
On the job learning (all types) = 67.2% in 4576 secs
Formal learning (all types) = 55.7% in 4644 secs

The average AutoCAD self-rating score for Civil Engineers is: 3.7.


MicroStation

Avg score/time for:

Regular users = 69.6% in 5144 secs
Part-time users = 64.6% in 5455 secs
1 to 5 years' experience (all types) = 61.3% in 5656 secs
6 to 10 years' experience (all types) = 73.3% in 4842 secs
11 to 15 years' experience (all types) = 71.4% in 4338 secs
On the job learning (all types) = 71.6% in 4686 secs
Formal learning (all types) = 57.5% in 4339 secs

The average MicroStation self-rating score for Civil Engineers is: 3.6.


Trends:

Overall, regular users are more accurate and faster than part-time users.

Civil Engineers with 6 to 10 years of experience with these tools are the most efficient. Performance reaches a plateau after c. 7 years, with no further productivity gains.

Civil Engineers who learn to apply these tools on projects are more accurate than those with primarily formal training but less practical experience. There is little to choose between the two in terms of speed.

Civil Engineers are generally confident in their ability to use technical software tools on Construction projects, although overall, there is still room for further improvement in their core skills.
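The group averages above are simple means over all matching test sessions. As a rough illustration of the aggregation (the records below are hypothetical, not actual KS data), it could be sketched like this:

```python
from statistics import mean

# Hypothetical result records: (group, score %, completion time in secs).
# The values are illustrative, not actual KS data.
results = [
    ("regular", 73.4, 3294),
    ("regular", 70.0, 3400),
    ("part-time", 68.2, 4110),
    ("part-time", 66.0, 4200),
]

def group_averages(records):
    """Return {group: (avg score, avg time in secs)} over all matching records."""
    grouped = {}
    for group, score, secs in records:
        grouped.setdefault(group, []).append((score, secs))
    return {
        g: (round(mean(s for s, _ in rows), 1),
            round(mean(t for _, t in rows)))
        for g, rows in grouped.items()
    }

print(group_averages(results))
```

The same grouping step would simply be repeated for each segment reported above: usage frequency, experience band and learning route.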

R

Sunday, 14 December 2014

Top DAUG 2014 - The Results



This year, KnowledgeSmart and AUGI again teamed up to provide an interactive skills assessment, across 10 popular Autodesk software tracks: 3ds Max, AutoCAD, AutoCAD Civil 3D, AutoCAD Plant 3D, Inventor, Navisworks, Revit Architecture, Revit MEP, Revit Structure and Vault.

Once again, we had some great prizes up for grabs. Track winners walked away with a $100 USD Amazon voucher (a combination of high scores and fastest times determined the winner of each track) and a glass trophy, while the overall winner received an HP laptop and a free pass to AU2015.


We spent several hours on Tuesday setting up 16 networked PCs in the exhibition hall, at the AUGI stand.

Next, 144 copies of Autodesk software needed checking, then all the KS sample data sets had to be uploaded to each machine and checked again. Big thanks to Tony Brown for his assistance in accomplishing this task.




Here's how we looked when everything was finished:


The main competition ran over 3 days (1 x 3 hour slot on day one, 2 x 3 hour slots on day two and a final 1 hour slot on day three). Contestants had to answer 8-10 questions, using the 2015 version of each software title. Each session was limited to just 12 minutes, so people had to work fast! Special thanks to awesome AUGI team members, Kristin, Bob, Michael, Donnia and Richard, for briefing users and making sure everything ran smoothly.

Here are the Top DAUGs in action:


By the end of the contest, we posted 215 results.

Throughout the competition, we displayed a rolling list of the top 10 contestants for each track, on the AUGI big screen.

The Results

Congratulations to the following contestants, who won their respective tracks:


And a special mention to the overall winner of AUGI Top DAUG 2014:

Tracy Chadwick



Analysis

So, let's take a detailed look at the results of this year's Top DAUG competition.

Overall

No. of Tests Completed:  215
Overall Average:  59% in 9 mins 14 secs
(NB the average score for 2013 was 48%).



Track 1 - 3ds Max

Track Winner: Herb Wagner
Winning Score: 55% in 11 mins 35 secs

Top 2 Contestants:


No. Completed: 2
Group Average: 35% in 11 mins 45 secs



Track 2 - AutoCAD

Track Winner: Tracy Chadwick
Winning Score: 100% in 3 mins 50 secs

Top 10 Contestants:


No. Completed: 68
Group Average: 63% in 8 mins 2 secs



Track 3 - AutoCAD Civil 3D

Track Winner: Christopher Fugitt
Winning Score: 88% in 5 mins 55 secs

Top 10 Contestants:


No. Completed: 21
Group Average: 66% in 9 mins 12 secs



Track 4 - AutoCAD Plant 3D

Track Winner: Patrick Alonzo
Winning Score: 72% in 11 mins 30 secs

Top 4 Contestants:


No. Completed: 4
Group Average: 52% in 10 mins 38 secs



Track 5 - Inventor

Track Winner: JD Mather
Winning Score: 81% in 11 mins 25 secs

Top 10 Contestants:


No. Completed: 21
Group Average: 41% in 11 mins 9 secs



Track 6 - Navisworks

Track Winner: Michael Doty
Winning Score: 60% in 6 mins 0 secs

Top 10 Contestants:


No. Completed: 13
Group Average: 49% in 9 mins 45 secs



Track 7 - Revit Architecture

Track Winner: Erik Eriksson
Winning Score: 99% in 10 mins 40 secs

Top 10 Contestants:


No. Completed: 47
Group Average: 57% in 10 mins 51 secs



Track 8 - Revit MEP

Track Winner: Ken Fields
Winning Score: 100% in 9 mins 15 secs

Top 10 Contestants:


No. Completed: 24
Group Average: 64% in 7 mins 32 secs



Track 9 - Revit Structure

Track Winner: Rebecca Frangipane
Winning Score: 85% in 8 mins 30 secs

Top 9 Contestants:


No. Completed: 9
Group Average: 70% in 9 mins 51 secs



Track 10 - Vault

Track Winner: Bryson Anderson
Winning Score: 60% in 6 mins 50 secs

Top 6 Contestants:


No. Completed: 6
Group Average: 53% in 6 mins 17 secs



Most Popular Tracks

The most popular tracks, in order of completed tests, were as follows:

AutoCAD - 68 results
Revit Architecture - 47 results
Revit MEP - 24 results
Inventor - 21 results
AutoCAD Civil 3D - 21 results
Navisworks - 13 results
Revit Structure - 9 results
Vault - 6 results
AutoCAD Plant 3D - 4 results
3ds Max - 2 results

Total = 215 results
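The ranking above is a straightforward frequency count over completed tests. As an illustration (the per-track counts are taken directly from the list above), it could be reproduced like this:

```python
from collections import Counter

# One entry per completed test; per-track counts from the list above.
completed = (
    ["AutoCAD"] * 68 + ["Revit Architecture"] * 47 +
    ["Revit MEP"] * 24 + ["Inventor"] * 21 +
    ["AutoCAD Civil 3D"] * 21 + ["Navisworks"] * 13 +
    ["Revit Structure"] * 9 + ["Vault"] * 6 +
    ["AutoCAD Plant 3D"] * 4 + ["3ds Max"] * 2
)

tally = Counter(completed)
for track, count in tally.most_common():
    print(f"{track} - {count} results")
print(f"Total = {sum(tally.values())} results")
```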


Honourable Mentions

100% Club

The following people achieved a maximum score of 100% in their track:

Tracy Chadwick - AutoCAD
Glen Sullivan - AutoCAD
Christine Noble - AutoCAD
Ken Fields - Revit MEP
(Plus Erik Eriksson scored a tantalisingly close 99% in the Revit Architecture track!)


Top Performers

The following people demonstrated impressive versatility by achieving above average scores in multiple tracks:

Chris Brown - AutoCAD, Inventor and Navisworks
Tracy Chadwick - AutoCAD, Revit Architecture, Inventor and Vault
Seth Cohen - AutoCAD and Civil 3D
Jason Dupree - AutoCAD and Inventor
Ken Fields - AutoCAD and Revit MEP
John Fout - Navisworks, Revit Architecture and Revit Structure
Rebecca Frangipane - Navisworks, Revit Architecture and Revit Structure
Brent McAnney - AutoCAD, Civil 3D and Vault
Kate Morrical - AutoCAD and Revit Structure
Mohaimie Mosmin - AutoCAD and Inventor
David Rushforth - Navisworks, Revit Architecture and Revit MEP
Maxime Sanschagrin - Revit Architecture, Revit MEP and Revit Structure
Jim Swain - AutoCAD and Inventor
Nick Tanner - AutoCAD, Navisworks and Revit Architecture


Boys vs Girls

Finally, let's take a look at the demographic breakdown of the competition. Out of 215 contestants, 185 were male and 30 female. The average overall performance for each group breaks down like this:

Girls: 64% in 9 mins 25 secs
Boys: 58% in 9 mins 11 secs




So, that's Top DAUG finished for another year. A thoroughly enjoyable 3 days at AU2014. 215 completed tests, across 10 popular tracks. Once again, the overall standard was extremely high, with some outstanding individual performances from our track winners and top 10 contestants.

Congratulations to all our winners. Thanks to the AUGI team for their support. Lastly, a big thank you to everyone who took part in this year's contest.

See you all in 2015 at the Venetian for more fun & games!

R

Sunday, 6 January 2013

KS & Global e-Training Links

KnowledgeSmart has been working with the team at Global e-Training, to map the results for our most popular test titles, to the corresponding modular training content, presented within the GeT Learning Management System.

These titles now have full links to learning enabled:


AutoCAD 2D fundamentals
AutoCAD 2D for occasional users
AutoCAD Civil 3D fundamentals (parts 1-4)
Navisworks fundamentals
Revit Architecture fundamentals
Revit Structure fundamentals
Revit MEP fundamentals (Mechanical)
Revit MEP fundamentals (Electrical)
Revit MEP fundamentals (Plumbing)

To follow in the Spring:

Inventor fundamentals (parts 1-4)
3ds Max fundamentals (parts 1-4)


Click here for a short video, which illustrates the user journey between testing and training.

R

Thursday, 3 February 2011

Assessing AutoCAD Civil 3D skills



Another new addition to the KS test library went live this week - AutoCAD Civil 3D 2011, module 1.

Now I'm not a believer that you can reliably create a 'general' Civil 3D test, so we're taking a slightly different approach with this one.

Because the KS tools offer AEC firms a tremendous amount of flexibility in the way they can customise test content, we are giving firms the final say on which modules their version of the C3D test will contain. The daisy-chain feature, which allows admins to link a series of individual modules together to form a longer test, means that firms can mix & match the C3D content in a wide variety of ways.
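At its core, daisy-chaining is just an ordered concatenation of modules. A minimal sketch of the idea, assuming a hypothetical Module type and question list (these names are illustrative, not the actual KS data model):

```python
from dataclasses import dataclass, field

@dataclass
class Module:
    """Hypothetical single test module (not the actual KS data model)."""
    name: str
    questions: list = field(default_factory=list)

def daisy_chain(modules):
    """Link individual modules, in order, into one longer test.

    Admins pick the modules and the order; the chained test is simply
    the concatenation of each module's questions.
    """
    combined = []
    for m in modules:
        combined.extend(m.questions)
    return combined

surfaces = Module("Surfaces", ["Create a TIN surface", "Edit a surface style"])
alignments = Module("Alignments", ["Create an alignment from a polyline"])
chained = daisy_chain([surfaces, alignments])
print(len(chained))  # 3 questions in the chained test
```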

So how are we approaching this task? Well, first, we're working with leading C3D author Eric Chappell to create 4 overall test modules, covering a range of C3D tools and commands. Each module will contain c. 25 test questions, with the usual mix of task-based and knowledge-based exercises.

These are the main topics that will be covered across the 4 modules:

  • Surfaces
  • Alignments
  • Profiles
  • Points
  • Sections & quantity takeoffs
  • Assemblies & Subassemblies
  • Corridors & intersections
  • Parcels
  • Grading
  • Pipes & pipe networks
  • Plan production
  • Data exchange
  • Rendering
  • Survey
  • Hydraflow


Module 1 is now available, covering Profiles, Alignments and Surfaces. Work has already begun on module 2, which will be finished in March. Module 3 will follow in April/May. The last module will be written in the summer; we'll review progress to date at that point and take feedback on any remaining features which haven't been addressed in modules 1-3.

Enterprise customers will see the new C3D content appear in their accounts this week. As always, we welcome your feedback on the material and also any topics you'd like to see covered, which are not mentioned in the summary list above.

R

Tuesday, 14 December 2010

KnowledgeSmart wish list

We've been holding a series of web review meetings with customers in recent weeks, discussing how their assessment programs are coming along, and what tools they would like to see next in the KS admin dashboard.

Here are the most requested items, the majority of which will feature in the next point release (scheduled for Feb 2011):
  • More charting; would like an easier way to tell (at a glance) what the main problem areas/training topics are
  • Would like to see industry comparisons for most popular test scores
  • Want to display more results per page
  • Would like ability to invite interview candidates more easily
  • Would like to control level of test report feedback for interview candidates
  • Ability to delete test scores
  • View results for linked accounts
  • Notify admin if test expiry date passes (but test not taken)
  • Change status of users (e.g. ex-employee, interview candidate to employee, etc.)
We'll also be adding basic branding options for firms, new chart styles in the dashboard and more content management options for test modules.

In addition, personal dashboards for users, basic survey tools, a community area for sharing test content and a new coaching mode are all coming up in the next few months.

Our rolling program of new test content authoring is ongoing. This week sees the release of our first Adobe InDesign module. More RMEP content will be out shortly, plus the first in a series of Civil 3D modules goes live in January. Intermediate RAC content, a Revit families module and basic Photoshop content will follow in early 2011.

If you have additional suggestions for new features or tools, we'd welcome your feedback.

R