Thursday 31 March 2011

KS Test Library Update

It's fair to say that the KS test library is a work in progress.  It has also been mentioned (on more than one occasion!) that creating up-to-date test material for the plethora of design, engineering, modelling, civil, structural, building services and analysis tools used by AEC firms is a job for life! :)

However, we like a challenge, so ever onwards do we march with our library roadmap!

Progress to date has been very good, with almost 30 test titles currently available to AEC firms, across a range of leading vendors.

Here's an up-to-date list, as of the end of March:


  • 3ds Max – modules 1-4
  • Adobe InDesign for occasional users
  • Adobe Photoshop for occasional users
  • AEC Layers Standards – Autodesk
  • AutoCAD 2D fundamentals
  • AutoCAD 2D – Xpress
  • AutoCAD 2D for occasional users
  • AutoCAD Civil 3D (part 1)
  • AutoCAD Civil 3D (part 2)
  • Design Review fundamentals
  • Revit Architecture fundamentals
  • Revit Architecture – Xpress
  • Revit MEP fundamentals (Mechanical)
  • Revit MEP fundamentals (Plumbing)
  • Revit Structure fundamentals
  • Revit Structure – Xpress
  • AEC Levels Standards – Bentley
  • Bentley Architecture fundamentals
  • Bentley GEOPAK (Drainage)
  • Bentley InRoads (Roadway Design)
  • Bentley View fundamentals
  • MicroStation 2D fundamentals
  • MicroStation 2D – Xpress
  • MicroStation 2D for occasional users
  • MicroStation 3D fundamentals
  • Rhino fundamentals

We also have a pretty hectic 'work in progress' list, with a variety of new modules at various stages of completion.

Here's what's on the drawing board at the moment:


  • Revit Families 
  • Revit Project Process 
  • USACE ‘Attachment F’ 
  • Ecotect 
  • Navisworks Manage 
  • Revit Architecture 2012 (additional questions) 
  • Revit MEP fundamentals (Electrical) 
  • AutoCAD Civil 3D (part 3)
  • Revit MEP 2012 (additional questions) 
  • Google SketchUp 
  • AutoCAD Architecture
  • Revit Structure 2012 (additional questions)
  • Bentley ProjectWise
  • Bentley Navigator
  • MicroStation V8i (Select Series 1-3)
  • IES Virtual Environment
  • Tekla Structures
  • Bricscad
  • Codebook

So by the time we get through that little lot, we'll have close to 50 titles in the KS library.  Well, that's our goal for the end of 2011, anyway!

In addition, we're seeing quite a bit of activity in customer dashboards, with some high quality custom test material being authored in many AEC firms.

So, that's a quick summary of where things currently stand with our test library.  As always, we welcome your comments, feedback and requests for new material.  If it isn't on the list, please let us know and we'll make sure it gets added to the workflow.

R

Wednesday 30 March 2011

Customising your Test Invite Text

KS admins have two main options for adding custom text to their test invites.

Option one
The default invite email text appears in the 'Invites' page of the KS dashboard.


From this page, a KS admin can overwrite the default text in the 'Email subject line' and 'Your message' fields, on a per-invite basis.  You have a maximum of 800 characters for your test invite text.

Option two
Alternatively, KS admins can write their own custom invite text, which overwrites the KS default text.  To do this, go to Settings > Test Invite.


Here, a KS admin can enter their custom invite text and email subject header.  You have a maximum of 800 characters for your test invite text.  Click 'Save' to register the changes.  This custom text will now replace the KS default message on all test invites sent from the account.


The custom invite text appears on the main test invite page.


To revert to the default KS invite text at any time, a KS admin simply deletes the custom text from the Settings > Test Invite page of the dashboard.

R

Building your List of Users

KS admins can upload a list of users into their dashboard and capture additional data against each user record, which can later be used for searching and grouping test results.  Here's how the process works:

Step one
Log in to your admin dashboard and select 'Users' from your navigation tools.


Step two
Click on the 'Upload User Data' link and take a look in the 'Help' notes box on the right-hand side.  Tucked away in paragraph 2 is a link to a sample csv file, which you can use to build a list of your users.



Step three
Copy your user info from Excel into your csv file.  You need the following fields: Email, First name, Last name.  You can also capture user status and 5 additional data fields for information of your own choosing.  For example: employee ID, project team name, office location, job title, industry discipline, etc.  Think of information you might like to use when searching and filtering your results data later on.
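If it helps to see the shape of the data, here's a minimal Python sketch of a user list in the format described above.  The column headers and the example datafield values are illustrative assumptions only - check the sample csv linked from the 'Help' notes for the exact headings KS expects.

```python
import csv
import io

# Hypothetical column layout, based on the fields described above:
# Email, First name, Last name, plus user status and 5 free-form datafields.
# The real KS sample csv may use different headings - check before uploading.
FIELDS = ["Email", "First name", "Last name", "Status",
          "Data1", "Data2", "Data3", "Data4", "Data5"]

users = [
    {"Email": "jane.doe@example.com", "First name": "Jane",
     "Last name": "Doe", "Status": "Employee", "Data1": "E1234",
     "Data2": "Project North", "Data3": "London",
     "Data4": "Architect", "Data5": "BIM"},
]

# Write the list to an in-memory csv, exactly as you would to a file
buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=FIELDS)
writer.writeheader()
writer.writerows(users)
print(buffer.getvalue())
```

The datafield values (employee ID, project team, office, job title, discipline) are just examples of the kind of information you might capture for later filtering.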


Step four
Browse for your completed csv file and use the 'Upload' tool to add your user list to your KS dashboard.



Step five
Click on the magnifying glass icon, next to the user name (far right of the row), to add to or edit the 5 user datafields.


That's it - you've built your KS user list!  If you subsequently upload a new csv, with additional user records, then the new list will merge with the existing user list.

Searching your user datafields
You can use the information captured in these 5 datafields, to filter your results data in the 'Results' area of your dashboard.  Select the 'Show search' link in the orange bar, at the top of the Results page.  You'll see the five datafields on the right of the search box.


Enter the information in the relevant datafield to filter your data, then select the 'Show groups' link in the orange bar, to create new sub-groups, based on the content of your user datafields.


Batch inviting
Once you have built your user list, you can invite groups of people to take a test at the same time (provided they are all taking the same test!).  Check the boxes next to the names of your invitees and hit the 'Invite selected users' button at the bottom of the page.


This takes you to the test invite page and pre-populates your user fields.  It is important to remember to select the test you want from the library dropdown at the top.  The default list displays A-Z, with 3ds Max at the top. (If you forget to change the test title before sending your invites, you may have some confused users asking why you're asking them to take a 3ds Max test!  Unless, of course, that's the one you wanted! :) ).

When you're happy with your invite text, just hit the 'Send Email' button to send your batch invites.


R

Test Feedback Options

KS admins can set different levels of feedback for users at the end of a test session.  It is also possible to select different levels of feedback for interview candidates and in-house users.  Sometimes firms don't want interview candidates to receive a test report immediately after their test session.  So now you have the choice of when to provide user feedback - and just how much information to share.

It's a straightforward process to choose your feedback settings:

Log in to your KS dashboard and click Settings > Test Feedback.


There are 4 choices:

1) No report
2) Basic report
3) Feedback on report
4) Feedback & coaching on report


The default setting is as follows:

Interview candidates - basic report


In-house users - full report with feedback and coaching notes


When you've made your selection, hit 'Save' to register your changes.  This setting applies to all tests taken on your account.  If you want to adjust the amount of feedback for different groups, just amend the setting in your dashboard, before you send out the test invites.

R

Branding your Test UI and Reports

We recently added a light touch of corporate branding for KS customers.

It is now possible to replace the KS logo on all user-facing test UIs and reports with your own company logo.  We recommend that you do this, as it makes the whole experience for users feel more like an in-house training needs analysis exercise, rather than a scary third-party skills test!

It's a really simple process:

Step one
Choose a copy of your company logo.  Keep the file size sensible, as it is designed to fit in a comparatively small space, at the top of your admin dashboard, test UI and test reports.  About this size should do just fine.


Step two
Log in to your admin dashboard and select Settings > Branding from your navigation tools.


Step three
Use the browse and upload tools to select your logo and upload a copy to the KS dashboard.


Step four
Check the boxes to display your logo in 3 places: your KS dashboard, your test UI and your test reports.


Click the 'Preview' buttons to view how your logo will display in the test UI and test report.  You'll probably note that we've removed the KS orange colouring from the test reports, replacing it with a more neutral light grey, to avoid any unfortunate clashes with your own colours!

Sample Test UI


Sample Test Report



We're gradually dialling back on the KS branding, in favour of your own corporate colours.  In future, you'll see a smaller, 'Powered by KnowledgeSmart' logo, appearing at the footer of most of our collateral.


R

Setting up Test Sessions from a Browser

It's not always possible for users to access a test invite via email.  For example, interview candidates, or in-house candidates who are using a training room, where access to their email account is not available.

For this reason, we have introduced a new feature, which allows KS administrators to set up a new test session straight from a browser.

Here's how it works:

Step one
Admin selects the 'Admin Test Setup Login' link on the dashboard landing page.


Step two
Admin enters username and password to access their account.



Step three
Admin selects account from the dropdown (for firms with multiple accounts) and selects the required test title from their library dropdown.  This list includes KnowledgeSmart OTS (off the shelf) tests and self-authored tests, displaying in A-Z order.  Test ID numbers are also included, so admins can easily identify which test they want from their library.


Step four
This is where the candidate takes over.  They enter their name and email address, select their status (employee or interview candidate) and proceed to the test landing page.


Step five
This is the test landing page, the same page users see when they click on the test URL in their email invites.  From here, the process for taking a test is the same, irrespective of how they were invited.  When they have read their test instructions and downloaded their sample data files, they hit the green 'Start' button to begin their test session.



R

Thursday 24 March 2011

Creating and editing KS tests

AEC firms can use the KS system to create their own skills assessments from scratch.

Let's take a brief look at the test creation journey.  We have built an intuitive, 4-step process, for writing and editing custom tests.

To edit an existing KS test, first hover your mouse over the 'Edit test' button, in the 'Your Tests' page of your admin dashboard.


Note that a small pop-up menu appears to the left of the 'Edit test' icon.  This menu allows you to skip straight to the relevant step of the edit process, depending on the change you want to make.

Step one is where you enter the test name and a brief description of the key features being covered by the test.


Step two is where you assign modules from your library, to be linked together in a 'daisy-chain', creating a longer test.  Use the green '+' icon to add modules to your test.  Conversely, you can remove modules by selecting the delete icon next to the module name, in the 'Your modules' panel on the right of your dashboard.


Step three is where you choose how many questions you want to be assigned in each module. The 'Available questions' column displays the total number of questions in the library 'pool'.  The 'Assigned questions' column displays how many questions will be randomly presented to users during a live test. Click on the 'Edit test module' icon, to select the required number of questions in each stage, then click the green tick to save your changes.
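As a rough illustration of the 'assigned from the pool' behaviour described above, here's how a random draw might work in Python.  This is a sketch of the idea only, not the actual KS implementation:

```python
import random

def draw_questions(pool, assigned_count, seed=None):
    """Randomly pick `assigned_count` questions from a module's pool.

    Illustrative sketch of the behaviour described above: the 'Assigned
    questions' number is drawn at random from the 'Available questions'
    pool for each live test session.
    """
    if assigned_count > len(pool):
        raise ValueError("Cannot assign more questions than the pool holds")
    rng = random.Random(seed)
    return rng.sample(pool, assigned_count)  # no repeats within a session

pool = [f"Q{n}" for n in range(1, 21)]   # 20 available questions
live_test = draw_questions(pool, 10)     # 10 assigned per session
print(live_test)
```

Each user therefore sees a different (but same-sized) subset of the module's question pool.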


Step four is where you can edit your test settings.  Here you can set your preferences for presenting the sample test files to users, set a time limit on test sessions, provide additional instructions to your users, hide or show the timer in the test UI and assign meta tags, or keywords, to each test.  Category tags are used to help you search and group your tests, back in your main library.


You can view a summary page for each test, by clicking on the magnifying glass icon, next to the test name, on the 'Your Tests' page.




KnowledgeSmart offers AEC firms a flexible solution for assessing the technical software skills of their teams. They can choose from a wide range of 'off the shelf' test titles, for design and engineering software.  And they can write their own test material from scratch.

R

Creating and editing KS modules

When building a KS test, we usually group questions by topic into modules.  Modules can be deployed individually as a short test, or, in turn, they can be linked together (in what we refer to as a 'daisy-chain') to create a longer test.

Let's take a brief look at the module creation journey.  We have built an intuitive, 3-step process, for writing and editing test modules.

To edit an existing KS module, first hover your mouse over the 'Edit module' button, in the 'Your Modules' page of your admin dashboard.


Note that a small pop-up menu appears to the left of the 'Edit module' icon.  This menu allows you to skip straight to the relevant step of the edit process, depending on the change you want to make.

Step one is where you enter the module name and a brief description of the key features being covered by the module.


Step two is where you assign questions from your library, to be grouped together according to topic.  Use the green '+' icon, to add questions to your module.  Conversely, you can remove questions by selecting the delete icon next to the question name, in the 'Your questions' panel on the right of your dashboard.


Step three is where you assign your meta tags, or keywords, to each module.  Category tags are used to help you search and group your modules, back in your main library.


When you have finished creating or editing your modules, you can view a complete summary of all the key elements for each module, by clicking on the module name, back in 'Your Modules'.  This displays a green popup summary box - what we refer to as the module 'footprint'.



The KS system is very flexible.  Administrators can move questions around between modules and move modules around between tests.  You can use the KS content, 'off the shelf' (completely unchanged), you can edit the KS content and you can create test material from scratch.  Then use a blend of all this material, to build your own skills assessments for your firm.

Next, we'll look at the process for creating and editing KS tests.

R

Creating and editing KS questions

The KnowledgeSmart online skills assessment system offers a high degree of flexibility to administrators, in terms of the way they can customize their test library.

We work closely with specialist authors around the world, to create quality assessments for a range of design and engineering software titles.  These assessments can reliably be used 'out of the box', or, as we prefer to say, 'off the shelf'. (We don't actually have a box!).

But many firms prefer to edit the OTS content, to better match their own particular standards, workflows or project environment.  They also have tools to create their own test library material from scratch.

Let's take a brief look at the question creation journey.  We have built an intuitive, 5-step process, for writing and editing test questions.

To edit an existing KS question, first hover your mouse over the 'Edit question' button, in the 'Your Questions' page of your admin dashboard.


Note that a small pop-up menu appears to the left of the 'Edit question' icon.  This menu allows you to skip straight to the relevant step of the edit process, depending on the change you want to make.

Step one is where you enter the question name, a brief summary of the particular features being covered by the question and the question wording.


Step two provides you with the option of adding a file (or multiple files) to your task-based questions.  There are a number of file types which can be assigned to questions, for example, .dwg, .dgn, .rvt, PDFs, image files, movie files, and so on.  If you move questions around your library, from one module to another, the files move with the question.


Step three is where you input the answer to a question.  The fields vary, depending on the type of question, for example, free text, multiple choice, pick list, order list, or true/false question styles.


Step four is where you assign your meta tags, or keywords, to each question.  Category tags are used to help you search and group your questions, back in your library.  Training tags are used to help determine skills gaps amongst your users.  It is important to think carefully about which tags are assigned to your questions, as these will play an important part in your results analysis later on.


Step five is where you can view and edit the coaching notes for each question.  Coaching notes provide detailed feedback for users, on how to answer each question accurately and efficiently.  Coaching notes can be provided in a number of formats, for example: text only, text with accompanying screen shots, PDFs with words & images, short movies, etc.



When you have finished creating or editing your questions, you can view a complete summary of all the key elements for each question, by clicking on the question name, back in 'Your Questions'.  This displays a green popup summary box - what we refer to as the question 'footprint'.


Here are some practical hints & tips for KS authors:

- A full question set is defined as follows: usually 18-25 questions, format to be determined by the author, typically covering a range of core skills in a specified software application.

- A question set comprises live skills (task-based) and theory (knowledge-based) content. (Author to determine the appropriate mix of questions).

- A task-based question requires the candidate taking the test to have access to and use the software in order to elicit the answer.

- A knowledge-based question can be answered without direct use of the software.

- You have 5 types of questions to choose from: Free text, Multiple choice, Order list, Pick list and True or False. Try to get a balance of different styles in your question set.

- When writing multiple-choice or pick-list questions, provide at least 5 answer options.  Try to avoid general answers, such as ‘All of the above’, or ‘None of the above’.

- Don’t write too many ‘True or False’ type questions – it’s too easy to guess the answers!

- Use screen shots and image files, or short videos, to enhance the effectiveness of some questions (e.g. PNG, JPEG, Camtasia, etc.).

- Think about the meta tags to be assigned to your questions.  You will be making practical use of these tags when you analyze your test results and create groups and charts with your results data.

- Coaching notes should accompany each question. They are the author’s way of telling a user how they should answer a particular question accurately and efficiently.


Next, we'll look at the process for creating and editing your test modules.

R

Monday 21 March 2011

AEC IT Leaders Roundtable



KnowledgeSmart will be showcasing the latest release of our skills assessment tools for CAD, BIM, engineering & analysis software, at this year's AEC IT Leaders Roundtable.  The meeting takes place in San Jose, California, on 14 & 15 April.

KS CEO, Rory Vance, will be talking about the benefits of capturing performance metrics for a range of design and engineering software titles - and the pitfalls for firms who rely too heavily on self-assessment as a means of identifying training needs for their teams.

For more information, go to: http://sites.google.com/site/aecitleadersorg.

Thursday 10 March 2011

New charting options

This month sees the launch of the latest release of the KS tools, with a host of new features to be found in the admin dashboard.

One of the key updates relates to the range of charting and management reporting options now available.

Here's a brief summary:

Search & Group Results
The first task for an admin is to carve up the raw results data into more manageable blocks.  There is a range of searching and grouping options available, including: score ranges, date ranges, elapsed time, test topic, user names, training keywords, plus 5 additional custom datafields.



Performance Scatter
This chart displays group results in performance quadrants.  The upper left quadrant (Q1) contains results where the users completed their assessment accurately and fast.  The upper right quadrant (Q2) shows results which have high accuracy, but slower completion times.  Bottom left (Q3) represents lower scores, but in a fast time.  Lastly, the bottom right quadrant (Q4) shows test scores which are inaccurate and slow.  Hint: you don't want your core users anywhere near quadrant 4!
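For illustration only, here's one way the quadrant logic could be sketched in Python.  The score and time thresholds below are invented for the example - the real chart plots each result against the rest of the group, rather than using fixed cut-offs:

```python
def quadrant(score_pct, elapsed_minutes, score_threshold=70, time_threshold=60):
    """Classify a test result into one of the four performance quadrants.

    The thresholds here are illustrative assumptions, not KS's actual
    quadrant boundaries.
    """
    accurate = score_pct >= score_threshold
    fast = elapsed_minutes <= time_threshold
    if accurate and fast:
        return "Q1"  # accurate and fast
    if accurate:
        return "Q2"  # accurate, but slow
    if fast:
        return "Q3"  # lower score, but fast
    return "Q4"      # inaccurate and slow

print(quadrant(85, 40))  # Q1
print(quadrant(55, 90))  # Q4
```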



Training Requirements
This chart highlights training topics for a given group of test results. The logic analyzes all of the results for a group, references the training tags assigned to questions presented during a test and lists those tags in priority order.
Red indicates the tasks which have been answered incorrectly by most people in a given group. Orange is the next highest priority, followed by Yellow. Green training tags are the topics which have been answered correctly by most of the group, so represent the least urgent issues.
For example, 10 people are presented with one or more questions, which include the tag, ‘Lines’. If 7 of the 10 people answer one or more ‘Lines’ questions incorrectly, then the keyword, ‘Lines’, will be flagged at 70% and appear in Red on the chart.
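The 'Lines' example above can be sketched in a few lines of Python.  Only the 70% Red case is stated above; the Orange/Yellow/Green band thresholds below are assumptions added for illustration:

```python
def tag_priorities(presented, missed):
    """For each training tag, compute the % of users who answered at
    least one question carrying that tag incorrectly.

    `presented` maps tag -> set of users shown the tag;
    `missed`    maps tag -> set of users who got it wrong.
    The colour band cut-offs are illustrative assumptions.
    """
    out = {}
    for tag, users in presented.items():
        pct = 100 * len(missed.get(tag, set())) / len(users)
        if pct >= 70:
            band = "Red"      # missed by most of the group
        elif pct >= 50:
            band = "Orange"
        elif pct >= 30:
            band = "Yellow"
        else:
            band = "Green"    # least urgent
        out[tag] = (pct, band)
    return out

# The worked example above: 10 users shown 'Lines', 7 get it wrong
presented = {"Lines": {f"user{n}" for n in range(10)}}
missed = {"Lines": {f"user{n}" for n in range(7)}}
print(tag_priorities(presented, missed))  # 'Lines' flagged at 70.0%, Red
```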



Question Performance
This chart looks at how each individual question in a library has performed, in any given test. The logic analyzes all of the results for each test and presents an aggregate percentage score for each question: the sum of the scores recorded for that question, divided by the total number of results for that test on the account.
For example, 10 people answer a question called ‘Lines’. If 7 of the 10 people answer the question 100% correctly, 1 person scores 50% and the other 2 score 0%, then the question will score an aggregate of 75% and appear in Yellow on the chart. 
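The arithmetic in the example above is a straight average of the per-result scores for a question.  A tiny Python sketch:

```python
def aggregate_score(scores):
    """Aggregate % score for a question: the mean of its per-result scores.

    Mirrors the worked example above: seven 100% scores, one 50% and
    two 0% average to 75%.
    """
    return sum(scores) / len(scores)

scores = [100] * 7 + [50, 0, 0]   # 10 results for the 'Lines' question
print(aggregate_score(scores))    # 75.0
```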



Group Scores
This chart displays user performance for any given group, in descending order. The X-axis shows the % score attained and the Y-axis displays user names.


Group Comparisons
This chart allows firms to compare performance, from one group to another, across (up to) 9 sets of data at a time.  Group vs group comparisons can be used to compare a range of results data.  For example; pre and post-training performance, different project teams, firm offices from different geographic locations, data representing different job titles or industry disciplines, in-house data vs interview candidates, and so on.


Global Comparisons
This chart allows AEC firms to compare in-house performance against a wider set of anonymous industry aggregate results data, for selected tests. Currently, we are publishing results for: AutoCAD 2D, MicroStation 2D and Revit Architecture.  When some of our other test titles accumulate a sufficient body of results data, we will publish additional benchmark figures.


This release is scheduled for go-live mid-March.  The new range of charting and reporting options should make identifying skills gaps and training needs much easier for CAD & BIM administrators and training and learning development professionals.

We'll be revisiting charting options again later in the year, so if you have any additional requests for chart styles and reports, please let us know and we'll add them to the Summer/Fall roadmap.

R