Thursday, 8 December 2011

AUGI Top DAUG 2011 - The Results



The AUGI Top DAUG competition was introduced at Autodesk University, way back in 2000.  The contest originally consisted of two parts, based on AutoCAD 2D knowledge.  This year, AUGI teamed up with KnowledgeSmart to expand the range of topics to 7 tracks: 3ds Max, AutoCAD 2D, AutoCAD Civil 3D, Inventor, Revit Architecture, Revit MEP and Revit Structure.

Here is an overview of the contest from AUGI Hot News:
http://augi.typepad.com/files/augi-hotnews-daily---wednesday-nov-30-2011.pdf

There were some great prizes up for grabs, too.  Track winners walked away with a $100 USD Amazon voucher (a combination of high scores and fastest times determined the winner of each track) and the overall winner also won an HP notebook PC and a free pass to AU 2012!
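
(If you're curious how a 'high score plus fastest time' ranking works in practice, it boils down to a simple two-key sort: score descending, then time ascending as the tie-break.  Here's a minimal sketch in Python - the names and numbers are made up, and this is just an illustration of the idea, not the actual KS scoring code.)

    # Rank a track: highest score wins, fastest time breaks ties.
    # Hypothetical data - (name, score %, time in seconds).
    contestants = [
        ("Contestant A", 100, 415),
        ("Contestant B", 100, 480),
        ("Contestant C", 98, 380),
    ]
    ranked = sorted(contestants, key=lambda c: (-c[1], c[2]))
    print(ranked[0][0])  # Contestant A: top score, faster of the two 100s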

We spent several hours on the Monday, setting up 18 networked PCs in the exhibition hall, at the AUGI stand.  All 126 copies of the Autodesk software needed checking, then all the KS sample data sets had to be uploaded to each machine and checked again.  Big thanks to Daniel Heselwood, from Evolve, for answering our call for help.  (We actually called Daniel to ask for his help with a missing xref - and 4 hours later he was still helping us finish up!).

Here's a snapshot of the stand, when we were all set up:


The competition ran over 3 days (2 x 3-hour slots on each of the first 2 days, with a final one-hour session on the last day).  Contestants had to answer a range of questions, using the 2012 version of each software title.  Each session was limited to just 14 minutes, so people had to work fast!  The new format proved popular, with eager users queuing up to grab a spare seat.  Special thanks to AUGI team members, Bob Diaz and Michael Patrick, for briefing users and making sure everything ran smoothly.

Here are the Top DAUGs in action:


By the end of the contest, we had posted 305 results, across 7 tracks.  On a conference venue web connection, with the added ingredients of free beer and wine, we had a completion rate of 99%, which was way ahead of our anticipated 95% target.  (Our support team promptly rescued the two missing scores, so all completed tests were captured and logged on the AUGI dashboard).

Throughout the competition, we posted a rolling list of the top 10 contestants for each track, on the AUGI big screen.



The Results

Congratulations to the following contestants, who won their respective Tracks:

3ds Max Track Winner:  James Clarke
AutoCAD 2D Track Winner:  Brent McAnney
AutoCAD Civil 3D Track Winner:  Brian Hailey
Inventor Track Winner:  Joe Bartels
Revit Architecture Track Winner:  Aaron Maller
Revit MEP Track Winner:  David Raynor
Revit Structure Track Winner:  Rebecca Frangipane

And a special mention to the overall winner of AUGI Top DAUG 2011:

Brian Hailey



Analysis

So, let's take a detailed look at the results of this year's Top DAUG competition.

Overall

No. of Tests Completed:  305
Overall Average:  53% in 12 mins 56 secs



Track 1 - 3ds Max

Track Winner: James Clarke
Winning Score: 64% in 13 mins 55 secs

Top 10 Contestants:

James Clarke
Jesse Sandifer
Daniel Heselwood
Paul Mazzoni
Fernando Oliveira
Jens Tange
Matti Oopik
Yosun Chang
Douglas Bowers
Charlie Forsythe

No. Completed: 13
Group Average: 18% in 12 mins 15 secs
Top 10 Average:  20% in 12 mins 2 secs


Track 2 - AutoCAD 2D

Track Winner: Brent McAnney
Winning Score: 100% in 11 mins 25 secs

Top 10 Contestants:

Brent McAnney
Alex Lepeska
Scott Wilcox
Heather Shrieves
Timothy Vaughan
Ben Rand
Decio Ferreira
Jim LaPier
Richard Lawrence
Youssri Salman

No. Completed: 110
Group Average: 42% in 13 mins 43 secs
Top 10 Average:  82% in 12 mins 42 secs


Track 3 - AutoCAD Civil 3D

Track Winner: Brian Hailey
Winning Score: 100% in 6 mins 10 secs

Top 10 Contestants:

Brian Hailey
Richard Lawrence
Travis Winter
Bryan Thomasy
Kirk Noonan
Jeff Nichols
Bruce Klug
Brent McAnney
Charles D'Errico
Bill Neuhauser

No. Completed: 26
Group Average: 61% in 12 mins 33 secs
Top 10 Average:  90% in 11 mins 24 secs


Track 4 - Inventor

Track Winner: Joe Bartels
Winning Score: 80% in 8 mins 20 secs

Top 10 Contestants:

Joe Bartels
Gerrard Hickson
Andrew Warren
Bill Graham
Alex Karan
Marius Minnen
Anders Tokerud
Chris Brown
Scott Wayand
Curtiss Cooke

No. Completed: 27
Group Average: 59% in 13 mins 6 secs
Top 10 Average:  75% in 12 mins 9 secs


Track 5 - Revit Architecture

Track Winner: Aaron Maller
Winning Score: 100% in 6 mins 55 secs

Top 10 Contestants:

Aaron Maller
Brian Mackey
Anthony Tiefenbach
Steve Faust
Sean Darnell
Eric Bernier
Adam Ward
Douglas Bowers
Andrew Fisher
David Ivey

No. Completed: 84
Group Average: 79% in 11 mins 35 secs
Top 10 Average:  99% in 10 mins 7 secs


Track 6 - Revit MEP

Track Winner: David Raynor
Winning Score: 80% in 13 mins 25 secs

Top 10 Contestants:

David Raynor
John Karben
Fernando Oliveira
Jason Vaia
Philip Charlson
David Rushforth
Clifford Baker
Douglas Bowers
Maxime Sanschagrin
Paul Beseman

No. Completed: 23
Group Average: 54% in 13 mins 50 secs
Top 10 Average:  67% in 13 mins 49 secs


Track 7 - Revit Structure

Track Winner: Rebecca Frangipane
Winning Score: 98% in 13 mins 45 secs

Top 10 Contestants:

Rebecca Frangipane
Andrew Lawrence
Dezi Ratley
Brian Mackey
Eric Bernier
Matthew Hill
Tina Bos
Michael Patrick
Jason Rhodes
Fernando Oliveira

No. Completed: 22
Group Average: 56% in 13 mins 32 secs
Top 10 Average:  77% in 13 mins 5 secs




Further Analysis & Observations

Popular Tracks

The most popular tracks, in order of completed tests, were as follows:

AutoCAD 2D - 110 results
Revit Architecture - 84 results
Inventor - 27 results
AutoCAD Civil 3D - 26 results
Revit MEP - 23 results
Revit Structure - 22 results
3ds Max - 13 results


Training Needs Analysis

So what does all this mean, in terms of performance and training?

For 3ds Max, the top 10 training items were, in priority order:
Creation Parameters, Cross Sections, Map Channels, Map Coordinates, Object Data Flow, Object Modifiers, Object Properties, Pivots, Rotate and Space Warps.



For AutoCAD 2D, the top 10 training items were, in priority order:
Attaching References, Modifying References, Dimensioning Objects, Measuring, System Variables, Object Snaps, Arcs, Drafting Tools, Lines and Tangent.



For AutoCAD Civil 3D, the top 10 training items were, in priority order:
Changing a Slope, Pipe Networks, Sections, Corridor Surfaces, Corridors, Breaklines, Surfaces, Survey, Alignments and Parcels.



For Inventor, the top 10 training items were, in priority order:
Parameters, Tolerances, Features, Revolve, Sketch, Hole, Project Settings, Styles, Components and Parts.



For Revit Architecture, the top 10 training items were, in priority order:
Clipboard, Data Integrity, Modeling, Family Editor, Families, File Management, Worksharing, Room Tags, Walls and Annotation.



For Revit MEP, the top 10 training items were, in priority order:
System Inspector, Apparent Load Values, Creating Power Systems, Editing Properties, Modifying Light Fixtures, Panel Circuits, Panel Schedules, Photometric Web Files, Selecting Equipment and Electrical Settings.



For Revit Structure, the top 10 training items were, in priority order:
Families, Scope Boxes, Types, Visibility, Roofs, Floors, Slabs, Structural Deck, Annotation and Span Direction Symbol.



Revit Data

It's interesting to compare the Revit results across the 3 disciplines, which arguably reflect where Revit currently sits in terms of wider industry adoption.  Revit Architecture proved to be the most popular track, and the overall experience of the users taking part in Top DAUG suggests that this software is the most mature, in terms of individual expertise.  Revit MEP and Revit Structure were very close in overall performance, but considerably behind Revit Architecture on a like-for-like productivity comparison.

RAC overall:  79% in 11 mins 35 secs
RMEP overall:  54% in 13 mins 50 secs
RST overall:  56% in 13 mins 32 secs
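
One crude way to put a number on that like-for-like gap is to normalise average score by average time, giving a points-per-minute rate.  A quick Python sketch, using the figures above (an illustrative metric only, not part of the official contest results):

    # Average score per minute, as a rough productivity proxy.
    averages = {
        "RAC":  (79, 11 * 60 + 35),   # (avg score %, avg time in seconds)
        "RMEP": (54, 13 * 60 + 50),
        "RST":  (56, 13 * 60 + 32),
    }
    for track, (score, secs) in averages.items():
        print(track, round(score / (secs / 60), 1), "points per minute")
    # RAC ~6.8, RMEP ~3.9, RST ~4.1 - RAC users scored higher AND faster.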


It's also worth noting that the average Revit Architecture score, for the top 10 users, was an impressive 99% in 10 mins 7 secs.  In fact, 31 users out of 84 posted scores of 90% or higher, which is outstanding.

As an additional comparison, we posted 28 results for Revit Architecture at RTC USA, earlier in the year.  The overall average for Revit Architecture, at the RTC 'Top Cat' contest, was 79% in 9 mins 7 secs. (NB the questions were different at RTC, but the level of difficulty was on par with Top DAUG).


Range of scores

Interestingly, across the 7 tracks, we saw scores ranging from 0% to 100%.  Here is a summary of both ends of the performance scale:

7 x 100% scores (1 x AutoCAD 2D, 1 x AutoCAD Civil 3D, 5 x Revit Architecture).
13 x 0% scores (1 x 3ds Max, 11 x AutoCAD 2D, 1 x Revit Architecture).


Honourable mentions

Along with our track winners, the following contestants deserve a special mention, for their performance in the competition:

Brian Mackey - winner of the RTC USA Top Cat contest in the summer, Brian placed second in the Top DAUG RAC track (scoring 100%), and also placed in the top 4 for RST.  An exceptional effort!

Brent McAnney - won the AutoCAD 2D track and placed in the top 8 for AutoCAD Civil 3D.

Eric Bernier - placed inside the top 6 for both the RAC and RST tracks.

Douglas Bowers - placed inside the top 8 for both the RAC and RMEP tracks.

Fernando Oliveira - placed inside the top 10 for the 3ds Max, RMEP and RST tracks.

Richard Lawrence -  placed inside the top 10 for both the AutoCAD 2D and AutoCAD Civil 3D tracks.

Heather Shrieves - placed inside the top 5 for AutoCAD 2D and top 12 for Inventor.

Jason Vaia - placed inside the top 5 for RMEP and top 18 for RAC.


Boys vs Girls

Finally, let's take a look at the demographic breakdown of the competition. Out of 305 contestants, 271 were male and 34 female.  The average overall performance for each group breaks down like this:

Female:  61% in 12 mins 37 secs
Male:  53% in 12 mins 57 secs



So, there we have it! A thoroughly enjoyable 3 days at AU2011. 305 completed tests, across 7 popular Autodesk software applications.  It's fair to say that the overall standard was extremely high, with some truly outstanding individual performances from our track winners and top 10 contestants.  

Congratulations to all our winners.  Thanks again to the AUGI team for all their support, in helping us put together the new format for Top DAUG.  Particular thanks to AUGI President, David Harrington, past-President, Mark Kiker and past-Board member, Bob Diaz.  Lastly, a big thank you to everyone who took part in this year's contest.  See you all at the Mandalay Bay in 2012? 

R

Friday, 25 November 2011

KS devs - a brief look ahead

As we head to the close of the year, I thought it might be useful to drop a quick update on the blog, about the work currently under development.

The next release is split into two parts.  Part one goes live next month and introduces some fairly big changes to the core KS system.  These updates introduce a hierarchical tree structure into the user accounts, allow tests/modules to be shared between child accounts, allow users/results data to be moved between accounts, and add a new 'draft' phase when creating new test content.

Part two focuses on improving the 'Invites' area of the dashboard, with a number of great ideas from the KS user group being added to the system.  We'll also introduce personal dashboard pages, so individuals can track their own progress, over time.

A key theme for KS in 2012 is to continue the concept of 'links to learning'.  We are in discussion with a number of training providers about how to make the journey from assessment to learning even smoother.

Exporting user and results data is another common theme, which will be addressed further in 2012.  Tighter integration of skills assessment data, into wider HR systems and company Intranets, has been a popular discussion topic this year.

We'll be adding some new question types to the dashboard in a future release, including survey tools and the ability to gather and present user feedback and opinion, as well as test results.  The KS 'community' area will also continue to grow in the months ahead, offering KS firms more opportunities to exchange test questions and material.

More benchmark stats and reporting options will be appearing in the next release, with more detailed geographical and industry breakdowns for AEC businesses to compare performance.  We'll also be adding some new content management tools, which will be particularly helpful to KS firms with large test libraries.

Speaking of which, the KS 'off the shelf' library continues to roll on, with new modules appearing each month.  Next up, we'll be adding more Civil 3D questions, more Revit Structure and Revit MEP questions, a Navisworks Manage module, an Ecotect module, plus some new material based on Oracle's Primavera PCM application.

So that's a quick overview of what's coming over the next 3, 6 and 12 months.  As always, we welcome your feedback, constructive comments and new ideas.  It looks like 2012 is shaping up to be a busy one.

May we (continue to) live in interesting times!

R

AU 2011

OK, it's that time of year again.  KS is exhibiting at this year's Autodesk University.  We’re also co-hosting the AUGI Top DAUG skills contest this year:  http://www.augi.com/autodesk/AU/augi-top-daug.

In spite of the ongoing financial woes around the world, this year's show promises to be one of the busiest yet.  No doubt there will be much talk of BIM, VDC, IPD, Cloud computing and, if rumours prove to be true, announcements about an Autodesk move into the PLM space.  We'll see!

For now, if you're heading out to Las Vegas next week, do please drop by the KS booth, or come and see us at the AUGI Top DAUG area.

We look forward to meeting you.

R

Tuesday, 4 October 2011

BIMshow Live

Amongst a host of upcoming Conferences, Roundtables and Webinars, all on the ubiquitous topic of BIM, one in particular stands out from the crowd.

In November, the inaugural BIMshow Live takes place at the Business Design Centre, in London.

HOK's prolific blogger and tweeter, David Light,  provides a handy summary on his popular BIM & Revit blog.

Here's the link:  http://autodesk-revit.blogspot.com/2011/09/bimshow-live.html.

R

KS Community - BIM Fundamentals

There is a wide range of quality assessment content, currently appearing in KS dashboards, written by AEC firms from all over the world. Topics vary from in-house CAD & BIM processes, to 'how to use the company Intranet', to BIM workflows and standards, to Project Management skills, and more.

An interesting trend is that firms are actively engaging KS to facilitate a 'community' pool of content, whereby firms, applying the law of reciprocity, volunteer to share their own self-authored, non-proprietary test content with other KS firms, in exchange for the option to access material from the wider KS community.

It is an environment of cooperation and collaboration, which is refreshing in its simplicity and effectiveness.  And we're just getting started!  Already, we are seeing a stream of fresh, new, current material coming online, every month.  Oh, and the best modules are being added to existing KS libraries FREE of charge.

Under the umbrella of the free community content, something very interesting happened just last week.  A handful of firms contributed their in-house questions on 'BIM Fundamentals', to the community pool.

Now, whilst this module is a work in progress, I thought it would be interesting to send some invites out to the BIM community, to take a look at what has been authored so far.  To say that this has caused a lot of conflicting opinion would be quite the understatement!  I knew that attempting to create a general BIM test was going to be tricky, as ‘good’ BIM practice is still subjective, across the regions.

The fact is, this set is really meant to be ‘raw material’, i.e. not to be used by any firm un-edited. For example, there is a slight Revit bias in one or two of the technical questions, but that is principally because all these questions were written by AEC firms, many of whom are primarily Autodesk users. (One could also argue that this is reflective of industry, I suppose).  A Bentley or Graphisoft firm will almost certainly have an opposing view, and that's as it should be.

From my perspective, I think this set could be used by a firm (appropriately edited) to stimulate a more intelligent discussion in-house about BIM, BIM process, BIM strategy, what BIM is (and is not), and so on. The coaching notes are quite helpful to fill in the blanks in people’s general level of understanding.

We’ve had some interesting suggestions on additional topics to cover.  These include: IFC, IPD, IDM, information exchange protocols, clash detection and coordination procedures, BIM implementation plans, level of development standards, managing model merge from other consultants/sub-contractors, BIM workflows (who starts first, who has priority), 4D, 5D, COBie, field BIM, model validation, BuildingSMART, sustainability and the MacLeamy curve.

The simple fact is that, globally, the industry is still at a comparatively early stage of BIM adoption, and the questions in the set so far – all donated by firms in the field, using BIM every day on projects – reflect where their BIM thinking currently stands.

I think we’ll see some new BIM questions rolling in, over the coming weeks, as the discussion continues.  We’ve had such a wide range of comments already (in just a few days) so it is very clear that opinion remains divided, even amongst BIM ‘experts’! :)

On a related note, I was reading a thought-provoking article by Martyn Day, at AEC Magazine, entitled, 'The Trouble with BIM'.  (See link: http://aecmag.com/index.php?option=com_content&task=view&id=450).

As the KS community pool gathers momentum, I think we'll see a host of new test material coming online.  My view is that it’s easier for firms to start from an existing set of material, than to re-write the whole thing from scratch.  That's why having full editorial control over the content is vital.

As far as a universally accepted BIM standard goes, I think we're some way off realising this goal.  I'll leave the final word to one of the beta testers, who displayed an admirable sense of self-deprecation, and who clearly grasped the intent with which the beta invites were sent out: 'Does this mean I'm too DIM for BIM?' :)

R

Monday, 12 September 2011

BIM and Integrated Design - a new book by Randy Deutsch

Friend of KnowledgeSmart, Randy Deutsch, has written a fantastic new book called 'BIM and Integrated Design: Strategies for Architectural Practice'.

Unlike many of the recent books and manuals on BIM, this book is devoted to the subject of how BIM affects individuals and organizations working within the construction industry.

As one of the most popular speakers and presenters on the AEC circuit, Randy is well placed to comment on the journey that many firms are currently making towards BIM adoption.  In his book, Randy discusses the implementation of building information modeling software as a cultural process with a focus on the technology’s impact and transformative effect—both potentially disruptive and liberating—on the social, psychological, and practical aspects of the workplace.

BIM and Integrated Design answers the questions that BIM poses to the firm that adopts it. Through thorough research and a series of case study interviews with industry leaders, this book helps us learn:

•  Effective learning strategies for fully understanding BIM software and its use
•  Key points about integrated design to help you promote the process to owners and your team
•  How BIM changes not only the technology, process, and delivery but also the leadership playing field
•  How to become a more effective leader no matter where you find yourself in the organization or on the project team
•  How the introduction of BIM into the workforce has significant education, recruitment, and training implications

Covering all of the human issues brought about by the advent of BIM into the architecture workplace, profession, and industry, BIM and Integrated Design shows how to overcome real and perceived barriers to its use.

Click here for a link to Wiley's website, for more information and to order a copy of Randy's book.

R

Sunday, 11 September 2011

Links to Learning

An important theme for KnowledgeSmart this year is 'Links to Learning'. That is, making the connection between highlighted skills gaps and corresponding training content as simple a journey as possible for KS customers.

The latest KS release takes an important step along the path to achieving closer links to relevant, targeted, training material.

Here is a summary of how AEC firms can now display focused, modular training content, as part of their skills assessment results analysis.

We have added a new stage to the question create/edit process, called 'Learning material'.


From here, KS admins can link modular training material to their KS question library. The key is to map the training content to the training tags (metadata) for each question. The training content can be a variety of formats.  For example, in-house training notes, PDF help-sheets, Intranet pages, etc. Or, alternatively, training material from a 3rd party, such as video tutorials or web based training.  KS admins can use the text box to add notes or web links and the file upload tool to assign files or video links.  (NB to ensure that any web links open in a new browser, please use the 'Insert/Edit URL link' icon, enter the relevant web address(es) and select 'New window' in the dropdown).
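
Conceptually, the mapping is just a lookup from each question's training tags to one or more pieces of learning material.  Here's a minimal sketch of the idea in Python - the tags, resources and helper function are all hypothetical examples, not the KS data model:

    # Concept sketch: map question training tags (metadata) to learning material.
    learning_material = {
        "Object Snaps": [
            "http://intranet.example.com/cad/object-snaps.pdf",  # in-house PDF
            "http://training.example.com/videos/osnaps",         # 3rd party video
        ],
        "Worksharing": [
            "Notes: see the in-house Revit worksharing protocol, section 3.",
        ],
    }

    def resources_for(question_tags):
        """Gather the learning material mapped to a question's tags."""
        found = []
        for tag in question_tags:
            found.extend(learning_material.get(tag, []))
        return found

    print(resources_for(["Object Snaps", "Worksharing"]))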


Next, there is a new option in Settings > Test Feedback, called 'Feedback, Coaching & Learning on Report'.


Checking this option will prompt a new section to display in the user test report, at the end of their test session.  From here, users can access the wider training content relating to questions where they dropped marks.  So the idea is that KS Coaching notes provide feedback on each specific task, whilst Learning notes provide broader feedback on the topic being covered by each question.


We are in discussion with a number of 3rd party training providers, to map their learning material to the topics being covered by KS skills assessments, across a range of software titles.  In the Results > Training Requirements page, we have listed a number of training providers who are developing modular content which can map to the KS training tags.  (KS is not a reseller for any of these companies, but we are working together to provide closer links to their respective courseware solutions).


This development is another important step along the road to a continuously improving journey between assessment and learning.

R

Managing your Users

We recently updated the Users section of the KS dashboard.  Here is a guide to the new features.

When you log into your KS dashboard, you'll see some new sections in your menu sub-navigation:


It is worth pointing out that changes made to the user data in your dashboard do not affect your results data. This is simply an area where you can manage your user information.

Adding Users

There are a number of ways to add user records to your dashboard.  First, when you send assessment invites, the user record is automatically added to your user list (unless it is already present, in which case it will not be duplicated).

Next, you can go to the 'New User' page and add individual records from here.


Third, we have improved the way that user records can be uploaded via csv file.  You can export your user data as well.  Go to 'Import/Export User Data'.  In the Help notes box, you will find a link to a formatted sample csv template.


Save the template and copy across your user list from MS Excel, using the same 10 columns that you see in the template.  They are: Email, First name, Last name, Status, Datafields 1-5 and Additional Information.
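
As an illustration, here's one way to generate a correctly formatted import file programmatically (hypothetical names and data; the column labels follow the 10 columns named above, but do check them against the sample template from your dashboard):

    # Build a KS user-import csv with the 10 expected columns.
    import csv

    columns = ["Email", "First name", "Last name", "Status",
               "Datafield 1", "Datafield 2", "Datafield 3",
               "Datafield 4", "Datafield 5", "Additional Information"]
    users = [
        ["jane.doe@example.com", "Jane", "Doe", "Employee",
         "London", "Architecture", "Revit", "", "", "Joined 2010"],
    ]

    with open("ks_user_import.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(columns)
        writer.writerows(users)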


Browse for your csv file and use the 'Import User Data' tool to upload your new user list.

Please note that if you don't use the correct format in your csv file, the KS system will not accept the imported file.

You can also export your user list to csv, using the 'Export User Data' tool.


Additional Datafields

You can capture 5 additional fields of information for each user, by selecting the magnifying glass icon, next to the user record.


When you click the magnifying glass tool, you will see a dropdown display, where you can edit a number of fields, including changing the status of a user (i.e. from Interviewee to Employee, or Employee to Ex-Employee).


You can use the information captured in the 5 datafields to filter your assessment results, in the Results page of your dashboard.


You can add custom labels to the 5 datafields, by going to Settings > User Datafields. Input your new labels and hit 'Save'.  When you go back to the Users page and view individual records (by selecting the magnifying glass tool), the new labels will display against each user.



Filtering User Records

We have added searching and grouping functionality to the Users page, to make it easier for KS admins to filter their user data and create sub-groups of user records. This is especially useful when inviting different teams to take a skills assessment.

Select the 'Show search' and 'Show groups' tools, on the orange bar located at the top of the user page.


This opens up two new fields, where you can search against a number of user parameters, including name, username, email, user status and the 5 additional datafields.


When you have run your search, select the user records by checking the box next to the user name (the top box is a 'select all' option). Then enter your group name in the 'Create New Group' field and select 'Create'.


You can view your new groups by selecting the dropdown field, in the Groups table. You can also add/remove new users to/from existing groups.


Inviting Groups

To invite a sub-group of users to take a KS skills assessment, go to the Groups table, select the group in the dropdown, check the box(es) next to the names of your invitees and hit the 'Invite selected users' button.


This will pre-populate your user list on the assessment Invites page. Just edit your invite text and email header and hit 'Send Email'.


So managing KS user data is now even easier. Remember, changes made to your user records have no direct impact on your live results data.

R

Saturday, 10 September 2011

Evolve - Training Help Sheets for MicroStation and AutoCAD

Leading CAD & BIM Consultancy, Evolve, led by Principal Nigel Davies, have developed an excellent training resource for AEC firms using MicroStation and AutoCAD software.

Designed to align with the keywords and meta tags associated with the assessment questions in the KnowledgeSmart library, the help sheets enable AEC firms to deliver high quality, targeted training material to their teams, matching precisely with highlighted skills gaps, for a fraction of the price of traditional AutoCAD or MicroStation training courses.

This is another important step in our aim to provide Links to Learning, across a range of popular AEC technical software applications.

For more details, go to http://www.evolve-consultancy.com and request a free sample.

R

White Frog - Modular Courseware for Revit

As part of our 'Links to Learning' development, we have been working closely with the team at White Frog Publishing, to write new assessment content for Revit, across the 3 core disciplines (Architecture, Structure & MEP).

The Principals of WF have exceptional industry pedigree.  Peter Routledge was MD of Tekla UK and has a huge amount of experience in the AEC sector.  Paul Woddy is one of the UK’s leading independent Revit authors and trainers, having been involved with the product since the very early days, before it was acquired by Autodesk.

The WF training materials cover 13 core modules, plus a further 6 discipline-specific topics.  Similarly, the KS assessments present 95 new questions, across these 19 modular areas.  The first new module, Revit Architecture 2012, goes live next week.  The questions are written to comply with the AEC (UK) BIM Standard.

The idea, very simply, is to dovetail the KS results and highlighted skills gaps into the appropriate modular courseware developed by WF.  Unlike other Revit training resources, or video tutorials, the WF content is intended for instructor-led training and designed with the trainer in mind - whether a full-time professional or a part-time internal trainer - and includes all the tools to deliver high quality, consistent training for AEC firms. In addition to delegate notes, the trainer gets access to trainer notes, slideshows and datasets to support the lectures, as well as more datasets, drawings and AVI videos to support the practical exercises.

For more details, go to http://www.whitefrog.co and register for a web demo.

By the close of the year, our intention is to create assessments for RAC 2012 (metric & Imperial), RST 2012 (metric & Imperial) and RMEP 2012 (metric & Imperial).  More information and live product demonstrations of the combined assessment & training solution from KS & WF will be on display at Autodesk University.

R

Content Studio - a new way to manage Revit Families

Managing Revit Families is one of those thorny subjects that perplexes even the most BIM-savvy AEC firms.

Chris Senior, Principal of Revit Factory, has created a brilliant new tool for helping AEC firms to manage their Revit content.  I have had the pleasure of working with Chris for a number of years.  He is one of the most experienced independent Revit consultants and trainers in the UK, and a regular contributing author to the KS assessment library.

Content|Studio is a digital library solution used to collect, manage and share Revit Families in a quality controlled environment. Now, AEC firms can file and index all their Revit files in one consistent electronic filing structure.

Chris explains, 'The advantages in efficiency will repay the investment made by a firm in months rather than years.  Content|Studio offers much more than a conventional filing system; information is stored in an intelligent quality controlled library making it easy to capture, manage, share and use Revit files'.

For more details, click here and register for a free demo.

R

Friday, 2 September 2011

CADsmart software - end of an era

Some nine years ago, we started developing software plug-ins for AutoCAD and MicroStation, which delivered a practical skills assessment experience, linking to the APIs of the Autodesk & Bentley applications.

This was a successful formula for many years.  But, as is often the case with technology, time moves on and, particularly with the impact of BIM on the AEC sector, we realised that we needed a more flexible approach, in order to embrace the future direction of our industry, with respect to skills assessment.

We also experienced many headaches trying to keep up with API changes across a broader range of tools, such as Revit, Civil 3D, Bentley Architecture, Rhino - the list goes on.

Hence the KnowledgeSmart web based testing format was created a couple of years ago and continues to thrive, in spite of the challenging economic backdrop.

So, it is with mixed feelings that we announce the official 'retirement' of the 'CADsmart' suite of tools, at the end of September.  Farewell old friend.

R

AUGI Top DAUG




Breaking news..

KnowledgeSmart and AUGI will be working together to provide a broader range of assessment topics for this year’s AUGI Top DAUG skills contest, at Autodesk University, in November.

In recent years, the contest has focused exclusively on AutoCAD 2D skills.  However, for the 2011 Las Vegas conference, we will be expanding the range of topics to include Revit Architecture, Revit Structure, Revit MEP, AutoCAD Civil 3D, Inventor and 3ds Max (and, yes, AutoCAD 2D will still be represented!).

Based on our recent experiences, delivering a successful formula for the RTC Revit Top Cat contests in Australia and the US, we are confident that the new Top DAUG format will be a hit with AU regulars.

Contestants will be presented with a mix of knowledge and task based questions, using the 2011 release of Autodesk’s most popular AEC software applications. Because the range of titles is so diverse, there will be no grand final this year.  Just a straight shoot-out, for the title of Top DAUG for each vertical!

As for rumours of a heavyweight Revit title fight, between the RTC Top Cat and the AUGI Revit Top Daug, at the time of writing, we can neither confirm nor deny this story... :)

More to follow on this topic in the months ahead.

R

Friday, 19 August 2011

CAD & BIM Manager's Survey 2011


The UK Architectural, Engineering & Construction CAD & BIM Manager's Survey 2011 is now open for your input!

Organised by respected independent BIM consultants, Evolve, the survey is one of the most important of its kind in the UK construction industry, comparing the roles performed by CAD, BIM and Technology Managers in the sector. Topics covered include roles and responsibilities, salaries, the ratio of CAD & BIM support staff to users, opinions on support providers, quality of training, standards and much more.

Evolve Principal, Nigel Davies, comments, 'Responses in previous years have been phenomenal, so we encourage people to take part this year to help paint a true picture of CAD and BIM within the UK as we continue to slowly progress out of a deep recession, and BIM begins to become more and more important.'

Nigel adds, 'The questions are mostly multiple-choice so the survey should take around 20-30 minutes to complete fully. Please feel free to leave any questions unanswered if you wish; there are no required fields.'

Here's the link: https://www.surveymonkey.com/s/UKCADBIMManagerSurvey2011

Results will be published in the Autumn.

R

Monday, 1 August 2011

KS Library update



We've had a few mails in recent weeks, asking about updates to the KS library, so I thought I'd drop a few lines on the blog about what we're working on..

Updates to existing data sets are a common theme this summer.  We work with a network of independent specialist authors, to create the KS OTS (off the shelf) titles. Juggling schedules with talented authors is challenging - as their skills are always in demand!  However, we are making good progress, with a rolling programme of updates and new releases, taking us into the autumn period.

So, in approximate date order:

AutoCAD 2D fundamentals 2012 - Aug 11
AutoCAD 2D 2012 - Xpress - Aug 11
AutoCAD 2D for occasional users 2012 - Aug 11
AutoCAD Civil 3D 2011 (part 3) - Aug 11
RAC 2012 fundamentals (metric), new questions - Aug 11
RST 2011 fundamentals (metric), including new coaching notes - Aug 11
RST 2011 - Xpress (metric) - Aug 11
RST 2012 fundamentals (metric) - Aug 11
RST 2012 - Xpress (metric) - Aug 11

Google SketchUp fundamentals - Sept 11
RAC 2012 fundamentals (metric), including new coaching notes - Sept 11
RAC 2012 - Xpress (metric) - Sept 11
RAC 2012 fundamentals (Imperial) - Sept 11
RAC 2012 - Xpress (Imperial) - Sept 11
RST 2012 fundamentals (Imperial) - Sept 11
RST 2012 - Xpress (Imperial) - Sept 11

Inventor 2012 module 1 - Oct 11
Revit for Interiors 2012 - Oct 11
Revit Standards module 1 (community resource) - Oct 11
RST 2012 fundamentals (metric), new questions - Oct 11

AutoCAD Civil 3D 2011 (part 4) - Nov 11
Ecotect 2012 - Nov 11
Navisworks Manage 2012 - Nov 11
Revit for occasional users 2012 - Nov 11
RMEP 2012 fundamentals (metric), new questions - Nov 11

We'll also be adding 2012 data sets for the following titles (dates TBC):

Revit Content Creation
Revit MEP fundamentals (Mechanical) (Imperial)
Revit MEP fundamentals (Plumbing) (Imperial)
Revit MEP fundamentals (Electrical) (Imperial)
AutoCAD Civil 3D (parts 1-4)
3ds Max (modules 1-4)

So that's about it for the next few months.  These dates might be subject to change, but this gives you a pretty good idea of where we're aiming to be by the close of the year.  There are a few more ideas in discussion, including topics like Codebook, Bentley Navigator, MS Excel, MicroStation Select Series 1-3, Adobe Illustrator and USACE Attachment F.  We'll keep you posted..

R

Friday, 22 July 2011

Revit Standards - update & comments

OK, so we’ve had quite a few comments on this topic over the past week.  Most in favour, but some who are having a hard time believing that a ‘generic’ pool of material can have relevance to the way they work.  I accept both viewpoints, but ultimately my feeling is that it’s worth a try.

There are a number of firms who have agreed to pitch in and contribute a couple of questions, including, HASSELL, Opus, NBBJ, Capita, KEO, Cox, Woodhead, BDP, M+NLB, Jasmax, Zaha Hadid, HOK, Stantec, ARUP, Geyer, Levitt Bernstein, Pascall + Watson and FRCH.

If a handful of key people can find a spare couple of hours, over the next 5 or 6 weeks, we should have a sufficient volume of questions to comprise Revit Standards module one – Project start-up.

BTW, if anyone else wants to help, the proposition is pretty straightforward.  This is a FREE community resource.  (KS does not need to monetise this in any way.  If it works out, the association with a useful, helpful, relevant Revit community initiative will be worth the effort).  But there is a cost of entry, to join the club, if you like.  To use banking parlance, in order to make a 'withdrawal', a firm must first make a 'deposit'.  The cost of entry, to access the library of tools, is one new question, for each module in the Revit Standards series.

So we currently have 20+ firms writing a couple of questions apiece (either task based, or knowledge based), on any aspect of their choosing, on ‘good practice’ for project start-up, using Revit.

For each question, the required info is as follows (a rough sketch of a single submission appears after this list):
- Question copy (i.e. name, summary, question wording, answer(s), training keywords) using a sample spreadsheet template (which we can happily email across).
- Coaching notes, which either walk a user through the correct steps to successfully complete a task, or provide a line or two of background info, about the subject being covered in the question (and why it is important).
- Sample Revit file (if applicable, i.e. for task based questions).  If possible, we have requested that any sample data sets are created using Revit 2012.
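
Pulling those three items together, a rough sketch of a single submission might look like this (all content is hypothetical - the spreadsheet template defines the actual fields):

    # One crowdsourced question, structured per the fields listed above.
    question = {
        "name": "Project start-up - shared coordinates",
        "summary": "Adopting the coordinate system of a linked site model.",
        "wording": "You have linked the site model into your new project. "
                   "How do you adopt its shared coordinate system?",
        "answers": ["Acquire Coordinates from the linked instance"],
        "training_keywords": ["Shared Coordinates", "Project Start-up"],
        "coaching_notes": "Manage > Coordinates > Acquire Coordinates, then "
                          "pick the linked file, so all models share one system.",
        "sample_file": "startup_site_2012.rvt",  # task based questions only
    }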

We've suggested a 'deadline' of end August, for the first cut of material.  (I fully appreciate that everyone contributing to this experiment is super-busy, and doing this in their personal time, so the willingness to participate means a great deal).

We’ll then compile the questions into the KS system, present them in module format, and release the live content into the KS library.  I would anticipate that firms will then make the appropriate edits locally, to make it fit for purpose in their own environment.

It has been interesting to read the range of comments which have filtered through on this subject.  Here is a selection of (anonymised!) replies, both for and against.

Glass Half Full

"I think it would be a fantastic idea to have an assessment of someone’s understanding of either industry best practice or corporate standards. Essentially…’take this course to learn how we do things’…’now take a test to demonstrate you understand.’"

"This all sounds achievable - I'll programme in some time to complete.  Am excited to be a part of this. There is definitely common ground and (global) best practice for all of those core areas - I'd love to test our teams here on Project startup process....we have one in place but no one bothers to use it, which in turn leads to a higher support component..."

"I am a strong believer in model managers and have previously had a check list for model managers to go through on project start-up and throughout the project. I could see this module of KS acting as a tool to help analyse the best person to perform this role."

"Quite spookily we’re currently discussing (the need for) a Revit Project Setup KnowledgeSmart test to ensure Revit project team members understand the ‘absolute must do’s’ before being allowed to start work on a live Revit project."

"Yes, yes yes yes yes!! Count us in. I agree that it’s a great idea and is definitely worth a try."

"Love the idea – I’ll give you 2 items."

"You asked “is there sufficient common ground, to enable the creation of a 'generic' set of Revit standards questions” and I think that the crowdsourcing approach answers that question perfectly. I suppose you can assume that if your most KS friendly firms can’t develop a standard set of questions, it probably can’t be done. But I think it can be."

"Coincidentally, this is something that (firm name) have been discussing recently – a test or checklist of some sort for project standards (setup, etc…). There are generic fundamentals prior to the more firm-specific things that all projects have to account for. We would argue that the more industry standards we can set now, the better off the industry will be in the future. CAD standards developed completely out of control and I would hope that we all learned a good lesson from that. . I love this concept!"

"Anything with a standard is tough.  I think crowd sourcing assessment material is a great idea."

"I think this is a great idea, one of the main things being that once again this will be a collaborative effort, and bring the BIM community together to hopefully create a common best practice series of ideas that aren't company specific. I love the idea of open source approach."

"I really like the “crowdsourcing” idea for a Revit standards assessment. Our (leadership) group just finished up trying to create a “BIM assessment” – something that would tell us if people “get” how BIM is different from CAD. I think we’d all agree that it would’ve been great to have a pool of questions to start from and tweak, rather than start from scratch!"

"Your email is well timed, as we have just been discussing the very issue of standards, their implementation and assessment. I think the idea is a good one and with all new ideas, even if it isn’t perfect the first time, it very soon will become perfect as everyone chips in. I feel this will be invaluable and we are happy to participate."

"Anyone having spent time in Revit or in the Revit community knows that the attitude is a bit more open for sharing content and procedures than has previously been expected from other software packages.  My observation is that while every firm has a twist on a standard, there are indeed commonalities for those best practices.  The concept of sharing these should not be viewed with trepidation but embraced as a learning opportunity. I think testing on best practices is a good idea to ensure the data has been communicated well to the users.  At the very least it will make them think about those best practices again to remind them it is a concern and is important."

"I think this is a very workable idea."

On The Fence

"Personally I really like the idea however I am a little dubious as to whether or not it will work. It’s hard enough getting a consensus about anything BIM related with just a handful of people, never mind opening it up to all. I agree that it does seem pointless everyone going through the same process but I kind of think this is unfortunately engrained into the architectural world so deeply that I can’t see it changing. The jury is out as to if the open standards would work.. my gut feeling is not.. I would love to be proved wrong!"

"Pulling together content, be it families, libraries, best practise in the way you have suggested is a great idea  (assuming people are happy to participate)."

"I think there is something to be said for some well managed crowd sourcing. You have a massive task ahead trying to achieve this, so good luck!"

"This could have some legs, you just need to get the right people to submit questions... I will find some time to submit a couple to see where this goes."

"We use Revit standards internally based on the AEC (UK), Autodesk and other published standards and best practice guidelines and our BIM process methodology draws on the work of Penn State, Georgia Tech, the VA and buildingSMART.  BIM standards are emerging, but there is still nothing definitive, maybe the US NBIMS with fill that gap.  Like CAD standards of old, I believe that Revit standards will tend to be developed by companies internally, with reference to external standards. We have some very specific requirements which fit with our process and back office systems, so I couldn't imagine relying on external standards alone."

Glass Half Empty

"What makes a firm unique? Why you would hire one Architect over another is the unique way they design. Each firm comes to solving the problem of design in a different manner. To me the CAD/BIM world is just a set of tools we apply to the way we design. Those tools must be flexible enough to be applied to very different methods of design from conception to working drawings. The more we try to find standards the more we will restrict our different ways of designing."

"The idea is sound, but I suspect you won't find anyone who will actually put in the time. Crowdsourcing sounds great in theory, but the reality is that most people are too busy (or too selfish!) to share."

"Sorry, we don't have time to get involved in this.  But keep us posted, as it sounds like an interesting idea, in theory, at least."

"A universal standard is the paramount end goal, yes.  But do I think it's valuable for KS to get involved?  Not really."


So there you have it.  An interesting mix of reactions, but on the whole more positive than neutral or negative comments.  Of course, the proof of the pudding is in whether people can actually free up the time to write a question or two.  Any crowdsourcing initiative lives or dies by the momentum generated by a few key people.

I think back to last year's KA Connect, where Chris Parsons told the story of 'shirtless dancing guy', which provides an interesting view on the significance of momentum and the all-important 'tipping point'.  http://sivers.org/ff

So, here we are, in the field.  Anyone care to dance? :)

R

Friday, 15 July 2011

Revit Standards

There has been much discussion in recent months surrounding common methods of 'good' working practice, in particular for firms using Revit.  Earlier this week, I attended the Leeds RUG and enjoyed listening to visiting speaker, Neil Munro, from Opus NZ, talking about Revit Standards & Protocols.

Just a few weeks ago, whilst attending RTC AU, I sat in on an interesting presentation from the committee responsible for the Australia & New Zealand Revit Standards document.  Here is a link to the ANZRS site, which is definitely worth a look:  http://www.anzrs.org/blog.

And of course we have our own home-grown initiative, the AEC (UK) BIM Standard for Revit, which has been making impressive progress over the past couple of years, with a strong committee-led approach:  http://aecuk.wordpress.com.

There is a useful discussion thread on standards to be found over on RevitForum.org.

And this month, the Twittersphere has been ablaze with postings about the new Open Revit Standard, led by industry stalwarts, Sean Burke and David Fano, among others.  For those Twitterers among you, search for the hash-tag #OpenRevStds, to follow the ongoing dialogue.  And check out the new wiki-style site: http://openrevitstandards.com.  This promises to be a fascinating initiative.

So I got to thinking.. as Revit is already a hugely popular topic for skills assessment material, how useful/feasible would it be to create a series of KS modules which cover 'good' practices for Revit standards?

I'm thinking of modules covering the following core areas:  Creating a Revit template file, Installation & setup of Revit, Project start-up process, Work sharing, Creating & managing Revit Families, Collaborative working, Draughting standards, Model management.  Anything else?

And whilst I appreciate that such modules would naturally vary from firm to firm, is there sufficient common ground to enable the creation of a 'generic' set of Revit standards questions, which could then be edited by individual firms, to wrap around their own in-house standards & protocols?  Would it be too much to suggest that anyone starting a new Revit project must first demonstrate to the team Principal or Job Captain that they have taken (and aced) the project start-up module?  That they do, in fact, understand key 'do's and don'ts' before being allowed to start work on a live Revit project.

Further, what if, instead of one individual author doing all the heavy lifting on this one, we employ the concept of 'crowdsourcing' the content, one module at a time?  For example, we could start with the Revit project start-up process and deliver a module of 20 questions on this subject.  This makes more sense than 200+ firms all reinventing their own version of essentially the same wheel!  In my experience, it is far easier to start with an existing set of data, than to write everything from scratch, each time.

So, to start the ball rolling - and to see if the concept has potential - what if we invite 10 KS-friendly firms, to write two questions each, on general Revit standards for project start-up?

My thinking is that this could be a 'free' resource, for any firms using Revit, who want to assess their users' understanding of in-house standards.  The 'cost' of entry to the club, however, could be to submit two (or more) questions to the overall Revit Standards 'pool'.

We have a number of Revit specialists whom we can call on, to sanity check the content and generally ensure that it is fit for purpose, prior to release into the Revit Standards library.

We'll be speaking to the team behind the Open Revit Standard, in more detail over the coming weeks.  It will be interesting to see if we can align any assessment content for Revit Standards, with the community-led initiatives currently taking place.

More to follow on this topic..

R