Sunday, 20 July 2014

Most In Demand AEC Software Titles

We read an interesting piece on the Black Spectacles blog this week about the primary software skills required by leading design firms in the US.

Data was collected from a survey of 928 job postings at the top 50 architecture firms.

In summary, over 70% of the architecture jobs surveyed require Revit skills and over 50% still require AutoCAD skills, with SketchUp the third most requested software skill. Hand-sketching was mentioned in only 4% of the jobs listed.

In contrast, here are the top 15 most popular skills assessment topics from the KS library. This list is compiled from data gathered from Architecture and Engineering firms in the UK, US, Canada, Aus/NZ, SA and ME regions. The split between A & E firms is more or less equal.

1) Revit Architecture (Autodesk)
2) AutoCAD (Autodesk)
3) Revit Structure (Autodesk)
4) Bentley MicroStation (Bentley Systems)
5) BIM Management (software vendor neutral)
6) AutoCAD Civil 3D (Autodesk)
7) Revit MEP (Autodesk)
8) Navisworks (Autodesk)
9) SketchUp (Trimble)
10) Adobe InDesign (Adobe)
11) Rhino (McNeel)
12) Bentley AECOsim Building Designer (Bentley Systems)
13) 3ds Max (Autodesk)
14) ArchiCAD (Graphisoft)
15) Adobe Photoshop (Adobe)

Not surprisingly, almost half of the titles listed are from the Autodesk stable. Bentley Systems and Adobe have two apiece, with Graphisoft, Trimble and McNeel making up the rest.  Interestingly, general BIM knowledge (no particular vendor) is a top 5 topic.

Here is a summary of the average score & time for the general level assessments, for each title:

1) Revit Architecture - 69% in 63 mins
2) Revit MEP - 65% in 76 mins
3) Adobe InDesign - 64% in 45 mins
4) Revit Structure - 63% in 64 mins
5) Adobe Photoshop - 62% in 48 mins
6) AutoCAD - 62% in 73 mins
7) Bentley MicroStation - 60% in 84 mins
8) Rhino - 57% in 42 mins
9) BIM Management - 56% in 44 mins
10) ArchiCAD - 56% in 51 mins
11) AutoCAD Civil 3D - 56% in 76 mins
12) Navisworks - 54% in 49 mins
12) SketchUp - 54% in 49 mins
14) Bentley AECOsim Building Designer - 54% in 71 mins
15) 3ds Max - 53% in 55 mins

In many cases, the data illustrates a clear need for additional basic skills training if the most popular software tools used in A & E businesses are to be used productively and efficiently on construction projects.

R

Wednesday, 25 June 2014

KS Summer Devs Update

As we approach the halfway point of the year, we thought it might be useful to provide a brief update on our current plans, including a look at what's coming up in the Fall release of the KS tools.

KS Software

There are two main themes for our next core system update: new question types and individual user dashboards.

The KS user groups have provided clear feedback that they would like our next release to focus on these two key areas. First, providing the means to capture user opinion as well as test scores, by adding new question types (e.g. matching list, fill in the blank, matrix, short essay) and introducing survey options (e.g. Likert scale, free text).

Second, making it easier for KS admins to chart results on a per-user basis, and allowing users themselves to log in to their own KS dashboard to monitor their personal assessment history, charts and benchmark comparisons. This new UI will be mobile-friendly and HTML5 compatible.

We also recently introduced a new menu option to capture results data on a per module basis.



KS Library

Our library updates are progressing well. All main Autodesk titles will be available in 2013, 2014 and 2015 formats within the next couple of months. We also plan to update our SketchUp, Adobe and Rhino data sets this year.

Our Revit MEP fundamentals metric data sets will be available in July, and our Revit MEP advanced modules will also go live next month. Advanced Civil 3D, Synchro and MS Excel are all on the 'to do' list for the coming months. Additional AutoCAD 2D and advanced Revit Architecture (RAC) questions are also in development over the summer.

We'll keep you posted as these updates progress.  As always, any feedback and suggestions on new tools and library titles are warmly received.

R

Wednesday, 7 May 2014

Structured Knowledge Management Using Skills Assessments - White Paper

The team at A2K, led by ANZ Consulting Manager Sean Twomey, has put together an excellent, thought-provoking paper which examines knowledge assessment and the benefits and process of using skills assessments for benchmarking in the field of design technology.

The paper has a particular focus on the CAD/BIM side of design and engineering companies, but the process applies to all facets of business. The examples referred to in this paper are based on Autodesk environments and processes (e.g. CAD and BIM), but are also applicable to Adobe, Graphisoft, Bentley and similar packages.

In his detailed analysis, Twomey looks at the process of analysing and harnessing assessment data to calibrate goals, drive improvements and measure progress, noting that one of the most practical and visible benefits of knowledge management is the way it facilitates decision-making. He also considers the challenge of overcoming the innate fear of assessments.

Here's a link to download the white paper.

R

Sunday, 16 March 2014

KS Devs Update - Spring 2014

Q1 is almost done and we've had a busy start to 2014. In fact, if a skills assessment service is a good bellwether for the general well-being of the wider industry, then this is our best start to a year since before the recession began. That's encouraging!

So here's a look at what we've been working on in the past few months:

We started off with a general update to our underlying platform, upgrading to .NET 4.5. And we added a new issue-tracking system for logging bugs and system updates.

The Results > Group Comparisons page has a new tool which allows admins to capture and save results comparison charts.


The Results > Data page has new search and export options for capturing 'Skipped' question data:



The Results > Training Requirements page has 2 new tools which display charts for 'Training requests' and 'Skipped' questions:


At the end of a test session, a new advisory note displays, which confirms how many questions have been skipped:


We added some new options for creating dynamic groups on the Users and Results pages:


We added dynamic grouping to the Invites > History page:


Plus the ability to recreate existing dynamic groups in other areas of the system. For example, if you have an existing dynamic group on the Users page, you can now recreate the same group on the Results or Invite History pages. This tool can be found on the Settings > User Datafields page:



Admins can now check the progress of existing 'in progress' tests, by clicking on the new link in the Invites > History page:


We added a new filter on the Invites > History page which allows admins to display a sub-list of users who have not yet received any invites:


We added a new tool which allows admins to anonymise charts on the Results > Performance Spread and Results > Group Scores pages:

The following tasks are currently being tested in our staging area and will shortly be deployed to live:

The test start page has been tidied up and re-formatted.

We fixed a bug which caused tests to display an error if they contained more than 71 questions.

We added the ability to create more than one draft invite template on the Settings > Test Invite page.

We added rich text options, including HTML links, to the Test Invites page.

We added an advisory which is triggered if a user enters too many answers for pick list questions. It says: 'This question has a maximum of x correct answer choices'.

We are adding new options on the Results > Data page to filter and export the user data capture info (which is captured on the pages at the start and end of a test).

We are adding a new page called 'Module Performance', which allows admins to analyse results data on a per module basis.

We are adding new search options on the Accounts > Move Users page.


In addition to the system updates, we are also continuing to work on the KS library. All existing Autodesk titles are available in 2014 format, plus the following: Vault, Civil 3D advanced, Revit MEP advanced, Revit MEP metric, AECOsim Building Designer, Bluebeam, Synchro and Tekla Structures.

Looking ahead, our summer dev list includes adding new question types and survey options, followed by individual user dashboards. We'll keep you updated as this work progresses.

R

Tuesday, 14 January 2014

A Decade of Skills Assessment...



At the end of 2013, we celebrated our 10th anniversary on our journey to develop innovative skills testing solutions for the AEC industry.

Here's a (slightly nostalgic) trip down memory lane, looking back at a decade of skills assessment...

Way back in 2003, we launched a start-up called CADTEST.  Here's our old logo:


This was a plugin software tool, based on the AutoCAD API, which tested basic CAD skills across the following 10 areas (for a simple percentage calculation):
Lines, Sheet Setup & Xrefs, Circles, Text, Blocks, Dimensions, Integration, Layers, Variables and UCS.


We continued developing this application throughout 2004, but eventually realised that we would need more detailed results analysis if the concept of skills testing was going to go any further.

We also experienced considerable resistance to the 'test' part of the brand, so a new name was launched in 2005 - CADsmart. We rewrote the software from the ground up and launched a new application, which offered greater functionality for capturing and analysing results data.


We swiftly progressed our technical offering by adding MicroStation skills testing, plugging into the Bentley Systems API - another industry first.


One of the most important decisions we made early on was to focus solely on skills testing, not training, hiring or consultancy services, and to take an independent approach to recommending training.

By 2006, we were doubling new sales each year and carrying over 90% renewals.  A brand refresh followed, with a new logo and marketing campaign.



We developed a more sophisticated online admin dashboard, with charting tools, screen recording, 100% movies and detailed user feedback.

In 2007 we introduced performance quartiles, more detailed benchmarking statistics, an 'Xpress' test for interviews, a candidate movie and a web based booking system.  We also ran our first 'CAD Focus Group' meeting, a popular forum for gathering user feedback and software wishlist ideas, which continues to the present day.

By 2008, the world was feeling the effects of the coming 'credit crunch' (read: recession) and the AEC industry was starting to embrace the concept of BIM with greater enthusiasm. We opened our US office in sunny Clearwater, Florida, and we hosted our 'Cutting Edge CAD Management' conference in London.


In the same year, we launched our first interactive Revit Architecture skills test and hosted our first AUGI Top DAUG skills contest at AU, in partnership with 4D Technologies.  We also presented at the Bentley conference in Philadelphia and attended our first RTC conference in Melbourne, Australia.


By 2009, the global recession was taking hold and AEC firms were turning away from hiring and training in large numbers. This was probably our most challenging year to date. But it also proved to be a turning point on our journey. With BIM adoption gathering pace, we realised that the plug-in technology we had developed was in danger of becoming obsolete. 'The Cloud' beckoned, and with it the ability for firms to customise their test experience and cover a much broader range of test topics. Our original approach of 'any colour you want (so long as it's black!)' was not going to allow us to grow much further.

As is often the case with technology, our original ideas were being overtaken by the growth of internet technology and the broad range of software being deployed across the AEC industry. One door was closing, but another one was opening... and the KnowledgeSmart brand was born.


Once again, we found ourselves developing new technology from scratch, but we had 6 years of invaluable ideas to build into our new platform. Plus a loyal customer base and extensive network of authors.

In early 2010, the KnowledgeSmart web-based test engine was launched. And in the summer of 2010 the original CADsmart software was finally retired from service.

Fast forward 4 years and we arrive at the present day, with a fully developed, interactive, web-based test engine, complete with a customisable library of 60+ software assessments (covering the most popular software products from Autodesk, Bentley, Graphisoft and Adobe) and a customer base of hundreds of firms, right across the globe.



And the funny thing is, in spite of everything we've built over the past decade (all the presentations, meetings, overseas trips, conferences, competitions, highs and lows), in many ways we're only just getting started on our journey to create a world-class skills assessment system.

We can't wait to see what the next 10 years will bring!

R

Saturday, 11 January 2014

Knowledge Trax - A New Approach



This is one of the best approaches to skills assessment and custom training that we have seen. It's called Knowledge Trax and it combines best practice for assessment, online and class-based training, and service delivery.

Knowledge Trax breaks this process down into 6 simple steps:

1. Assess
2. Analyze
3. Prescribe
4. Educate
5. Re-assess
6. Refine

Here's a detailed summary of the Knowledge Trax process, to help customers measure and improve performance and productivity.

Discovery
Discovery begins with a kick-off meeting where we help you identify technology, workflow, and organizational structure, along with staff roles and responsibilities.

Next, we’ll conduct a skills assessment, which includes assessment management, building role-based or custom assessments based on your needs, and developing custom questions based on your company’s workflow. Our management of this process ensures that all required assessments are conducted and that you are able to track who within your organization has and has not taken the assessments.

Planning
In this step, we will analyze your assessment results and build an education plan that combines instructor-led classroom training, instructor-led online training, and pre-recorded self-paced online training. We will then deliver and review this plan with you.

Execution
In the Execution phase, the Knowledge Trax team delivers the training based on the approved education plan. And because these plans are based on the unique needs of your company’s staff and roles, your team may be involved in instructor-led classroom training at one of our facilities, or in online training.

Maintain
In this phase, we conduct an initial post assessment which will identify areas for ongoing training needs, ongoing refresher training, or gap training. Based on your staff’s individual needs, we’ll deliver the training in the method that best fits them.

Take a look at this video which tells you more about the Knowledge Trax process.

Knowledge Trax partners with KnowledgeSmart and Global eTraining to deliver a complete customer experience.

R

Thursday, 19 December 2013

Results Analysis - A Detailed Overview



The main point of KnowledgeSmart is to provide AEC firms with detailed skills gap and training needs analysis data.  So let's take a proper look at the process.

To begin, we will assume that you have already gone through the process of adding users, sending invites and assessing your teams, and that you now have a healthy selection of results in your KS dashboard.

Don't forget, you control the amount of feedback that users get to see at the end of their test session, by selecting your preferred option on the Settings > Test Feedback page.


And you can also choose to display (or hide) the 'Skip Question' (I don't know the answer) button, the 'Request Training' (I'm not sure about my answer) button and the test logout button, via the Settings > Test UI Options page. This will have an impact on the types of results data you are able to capture and analyse later on.



Lastly, you can choose to display (or hide) the user data capture pages, which appear at the start and end of a test session. We will be adding some new search values shortly, which will allow you to create reports on user profiles and demographic data.



You can monitor the status of your invites (Not started, In Progress, or Completed) on the Invites > History page.


For the purposes of this exercise, we're using dummy results data, to protect the (incompetent?) innocent!

When your teams have completed their assessments, you can view your KS results in the Results > Data area of your dashboard.


Use the 'Results per page' drop-down to change the number of results displayed on screen.


Use the account drop-down menu to view results data across linked accounts.


Click on the score link to view a detailed summary of each test report. The report opens up in a new window.


You can display links to in-house or third party learning resources in the test report, which builds a bridge between the testing and training parts of your learning journey.



Creating Groups

You have a range of searching and grouping tools available for filtering and analysing your KS results in detail.

Step one
Select the 'Show search' link in the orange bar, at the top of the Results page.


There are a variety of different search parameters. You'll also see the five user datafields on the right hand side of the search box (you can add your own custom labels on the Settings > User Datafields page). Use these values to add more depth to your results searching & filtering options.


Step two
Enter the information in the relevant search field and hit 'Search' to filter your results data. Use the check boxes to select your results records for the new group.


Selecting the 'Training requests' filter will display a list of all users who have selected the 'Request training' button during their test session.


Click on each report to see which questions have been flagged.


Step three
Select the 'Show groups' link in the orange bar.


Step four
Use the 'Create New Group' tool to create a new sub-group of your results, based on the output of your search.  Once you have selected your results, type in the name of the new group and hit 'Create'.



Step five
Use the 'View group' drop-down menu to navigate between sub-groups of your results data.



Dynamic Groups

Use the dynamic grouping tool to create new groups which will automatically update when new test results are added to your dashboard.

Step one
Select the 'Show dynamic groups' link in the orange bar.


Step two
Use the 'New Group' tool to create a new dynamic grouping of your results.  Enter the group name in the field provided.


Step three
Use the 5 datafield drop-downs to create the rules you require for adding existing and future results to your group, then save your new dynamic group by selecting 'Save Group'.
For example, suppose a results group has been created based on datafield 1, City: London. The next time a new result is added to the database and the associated user record includes datafield 1, City: London, the result will automatically be added to the existing group.
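
To make the rule logic concrete, here's a minimal Python sketch of the matching idea (the field names, group names and data are made up for illustration; this is not the actual KS implementation):

```python
# Minimal sketch of dynamic-group matching (illustrative only; field and
# group names are hypothetical, not the KS code).

dynamic_groups = {
    "London office": {"datafield1_city": "London"},
    "London architects": {"datafield1_city": "London", "datafield2_role": "Architect"},
}

def matching_groups(result):
    """Return the names of every dynamic group whose rules this result satisfies."""
    return [
        name
        for name, rules in dynamic_groups.items()
        if all(result.get(field) == value for field, value in rules.items())
    ]

# A new result arriving in the dashboard...
new_result = {"user": "jsmith", "datafield1_city": "London", "datafield2_role": "Architect"}
print(matching_groups(new_result))  # ['London office', 'London architects']
```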


Step four
Use the 'View dynamic group' drop-down menu to navigate between sub-groups of your results data.



Exporting Results

You have a variety of options for exporting your KS results data.

Select the 'Export results to csv' button to generate a spreadsheet of overall results data, including a personal curriculum of training topics for each user.



Select the 'Export training info to csv' button to generate a spreadsheet of training workshops, together with a corresponding list of users who have been identified as potentially needing to attend each workshop.



Select the 'Export training requests to csv' button to generate a spreadsheet of users who have flagged questions for further training. The report lists the highlighted questions and corresponding training tags.



Select the 'Export skipped questions to csv' button to generate a spreadsheet of users who hit the 'Skip question' button during their test session. The report lists the skipped questions and corresponding training tags.
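
Once exported, the CSV files can be analysed further in your own tools. As a rough illustration, here's a short Python sketch which summarises average scores per office from a results export (the file name and column headers are assumptions for the example; substitute the headers from your actual export):

```python
# Illustrative post-processing of an exported results CSV. The "Office" and
# "Score %" column names are assumed; check your own export's headers.
import csv
from collections import defaultdict

scores = defaultdict(list)
with open("ks_results_export.csv", newline="") as f:
    for row in csv.DictReader(f):
        scores[row["Office"]].append(float(row["Score %"]))

for office, vals in sorted(scores.items()):
    print(f"{office}: average {sum(vals) / len(vals):.1f}% across {len(vals)} results")
```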

Charting & Reporting KS Results

There is a range of charting and management reporting options available in your KS dashboard.

Performance Scatter

This chart displays group results in performance quartiles. The upper left quadrant (Q1) contains results where users completed their assessment accurately and fast. The upper right quadrant (Q2) shows results with high accuracy but slower completion times. Bottom left (Q3) represents lower scores achieved in a fast time. Lastly, the bottom right quadrant (Q4) shows test scores which are both inaccurate and slow.
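
To illustrate the quadrant logic, here's a small Python sketch (the 60% score and 60-minute cut-offs are invented for the example; they are not the KS benchmark values):

```python
# Sketch of the quadrant classification. The cut-offs are illustrative;
# the real chart positions results relative to the group's benchmarks.
def quadrant(score_pct, time_mins, score_cutoff=60.0, time_cutoff=60.0):
    accurate = score_pct >= score_cutoff
    fast = time_mins <= time_cutoff
    if accurate and fast:
        return "Q1 (accurate and fast)"
    if accurate:
        return "Q2 (accurate but slow)"
    if fast:
        return "Q3 (fast but less accurate)"
    return "Q4 (inaccurate and slow)"

print(quadrant(72, 48))  # Q1 (accurate and fast)
print(quadrant(55, 85))  # Q4 (inaccurate and slow)
```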



Training Requirements

This chart highlights training topics for a given group of test results. The logic analyses all of the results for a group, references the training tags assigned to the questions presented during a test and lists those tags in priority order.
Red indicates the tags which have been answered incorrectly by most people in a given group. Orange is the next highest priority, followed by Yellow; Green training tags are the topics which have been answered correctly by most of the group, so represent the least urgent issues.
For example, 10 people are presented with one or more questions which include the tag ‘Rooms’. If 7 of the 10 people answer one or more ‘Rooms’ questions incorrectly, then the keyword ‘Rooms’ will be flagged at 70% and appear in Red on the chart.
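
In code form, the tag-priority idea might look something like this simplified Python sketch (the sample data and colour thresholds are illustrative; only the '70% incorrect maps to Red' figure comes from the example above):

```python
# Simplified sketch of the training-tag priority calculation. Colour bands
# below 70% are assumptions for illustration.
from collections import defaultdict

# For each result: the tags presented to the user, and the subset of tags
# where they answered one or more tagged questions incorrectly.
results = [
    {"presented": {"Rooms", "Walls"}, "failed": {"Rooms"}},
    {"presented": {"Rooms", "Walls"}, "failed": {"Rooms", "Walls"}},
    {"presented": {"Rooms"}, "failed": set()},
]

presented = defaultdict(int)
failed = defaultdict(int)
for r in results:
    for tag in r["presented"]:
        presented[tag] += 1
    for tag in r["failed"]:
        failed[tag] += 1

# List tags in priority order, most-failed first.
for tag in sorted(presented, key=lambda t: failed[t] / presented[t], reverse=True):
    pct = 100 * failed[tag] / presented[tag]
    colour = "Red" if pct >= 70 else "Orange" if pct >= 50 else "Yellow" if pct >= 30 else "Green"
    print(f"{tag}: {pct:.0f}% answered incorrectly -> {colour}")
```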


You can also view charts for Training Requests and Skipped Questions on this page.

Question Performance

This chart looks at how each individual question in your library has performed in any given test.
The logic analyses all of the results for each test and presents an aggregate percentage score for each question: the sum of every user's score on that question, divided by the total number of results for that test on the account.
For example, 10 people answer a question called ‘Doors’. If 7 of the 10 people answer the question 100% correctly, 1 person scores 50% and the other 2 score 0%, then the question will score an aggregate of 75% and appear in Yellow on the chart.
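
Here's that arithmetic as a tiny Python sketch (the 'Doors' figures and the Yellow band come from the example above):

```python
# Reproducing the 'Doors' aggregate-score arithmetic from the example.
def aggregate_score(per_user_scores):
    """Average per-user percentage score for one question, across all results."""
    return sum(per_user_scores) / len(per_user_scores)

doors = [100] * 7 + [50] + [0, 0]  # 7 full marks, 1 half, 2 zeros
print(aggregate_score(doors))      # 75.0 -> charts in Yellow
```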


You can also create a chart for the average time taken to answer each question, by changing the radio button at the foot of the chart.



You can quickly view the relevant question data by clicking on each question name in the 'Question List' box, on the right hand side of this page.


Select the 'Export question performance info to csv' button to generate a spreadsheet which presents a breakdown of per question time & score values for each user.



Select the 'Display question answer info' button to view a table of 'required' vs 'submitted' answer values for each user. This is an easy way to identify common mistakes and errors across multiple results.



We will shortly be adding a new page called 'Module Performance', which displays a per module breakdown of KS results data.


Group Scores

This chart displays user performance for any given group, in descending order. The X-axis shows the % score attained and the Y-axis displays user names.



Group Comparisons

This chart allows firms to compare performance from one group to another, across up to 9 sets of data at a time. Group vs group comparisons can be used to compare a range of results data.
For example: pre- and post-training performance, different project teams, offices in different geographic locations, data representing different job titles or industry disciplines, in-house data vs interview candidates, and so on.
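
As a simple illustration of a group vs group comparison, here's a short Python sketch which prints the average score for each group side by side (the group names and scores are hypothetical):

```python
# Sketch of a group-vs-group comparison using hypothetical data: the average
# score per group (the KS chart supports up to 9 groups at a time).
groups = {
    "Pre-training":  [52, 61, 48, 57],
    "Post-training": [71, 78, 66, 74],
}
for name, scores in groups.items():
    avg = sum(scores) / len(scores)
    print(f"{name}: {avg:.1f}% average across {len(scores)} results")
```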



Global Comparisons

This chart allows firms to compare in-house performance against a wider set of anonymous industry aggregate results data, for selected tests.


We will be adding new searching and reporting options in a future release, so you can analyse the user background info and demographics in greater detail. Regional benchmark stats and user profiles are a key theme for the global comparisons section.

So there you have it: a comprehensive strategy for capturing, analysing and reporting on skills gap and training needs data for your organisation.

R