Channel: The KnowledgeSmart Blog

KS Help Notes_Invites 04_Invites History

Admins can view a full list of sent invites, including test status, on the Invites > History page.


Use the account dropdown menu to view individual accounts or all linked accounts.


Use the Filter by invite status tool to sort all sent invites into 3 groups: Not Started / In Progress / Completed.


Search your invites by selecting the ‘Show search’ link and entering user name, test name or invite sent date in the relevant field.  Then hit the ‘Search’ button to filter your data.


Sent invites data can be exported to CSV for further analysis.
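Once exported, the CSV lends itself to quick scripted analysis. Here is a minimal sketch that counts invites by status; the `Status` column name and the sample rows are assumptions for illustration, so check the header row of your own KS export first.

```python
import csv
import io
from collections import Counter

def summarize_invites(csv_text):
    """Count exported invites by status.

    Assumes a 'Status' column; the actual KS export headers may
    differ, so adjust the key to match your file.
    """
    reader = csv.DictReader(io.StringIO(csv_text))
    return Counter(row["Status"] for row in reader)

# Mock export data for illustration (not real KS output):
sample = (
    "User,Test,Status\n"
    "Alice,Revit Fundamentals,Completed\n"
    "Bob,AutoCAD 2D,Not Started\n"
    "Carol,Revit Fundamentals,Completed\n"
)
print(summarize_invites(sample))  # Counter({'Completed': 2, 'Not Started': 1})
```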


Re-send multiple invites at the same time by selecting the relevant invite(s) using the check box(es) and clicking the Re-send invites button.


Delete multiple invites at the same time by selecting the relevant invite(s) using the check box(es) and clicking the Delete button.


Change the expiry date for one or more invites by selecting the relevant invite(s) using the check box(es) and clicking the Extend invites button.


Use the Resume Test Session tool to log back into user test sessions.



KS FAQ's

Here are the most commonly asked questions about the KS admin dashboard and assessment journey.

Q - What's the difference between a Sub-admin and a Full-admin?
A - Sub-admins can set up test sessions from a browser, using the 'Admin Test Setup Login' link, but not access the main KS dashboard. Full-admins can log in to the KS dashboard and use the browser test setup tools.

Q - How do I reset or change my KS password?
A - All users can reset their KS password at any time by clicking the 'Forgot details' link on the KS dashboard login page, which takes you here. Enter your KS username and the system will email a new password.
KS admins can set their own password by logging in to their KS dashboard and selecting the 'Change password' link. Copy the original password into the first field, then confirm your own password and save the changes.

Q - How do I add a new KS administrator to my account?
A - Go to the Accounts > Your Accounts page of the KS dashboard and click on the 'Administrators' tool (small people icon). From here you can add new Sub-admins or Full-admins, using the 'Add administrator' fields.

Q - I get an error message when I try to import my user list. Why is this?
A - First, make sure that you are using a .csv file format, not .xls or .xlsx. (See help notes on the Users > Upload User Data page for a link to a formatted .csv template). Also, make sure that you have 10 data columns in your .csv file.
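A quick pre-flight script along these lines can catch both problems before upload. This is an illustrative sketch, not KS code; the 10-column rule is taken from the help note above, and the function name and message wording are invented for the example.

```python
import csv

def validate_user_csv(path, expected_columns=10):
    """Pre-check a user list file before uploading to the KS dashboard.

    Checks the file extension and that every row has the expected
    number of columns (10, per the KS help notes).
    """
    if not path.lower().endswith(".csv"):
        return False, "File must be .csv, not .xls or .xlsx"
    with open(path, newline="") as f:
        for line_no, row in enumerate(csv.reader(f), start=1):
            if len(row) != expected_columns:
                return (False,
                        f"Row {line_no} has {len(row)} columns, "
                        f"expected {expected_columns}")
    return True, "OK"
```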

Q - Do I need a new username and password for the KS Support Ticket System?
A - Not really. You can use the same login details as you use to log in to the main KS dashboard, or you can choose a different username and password to access the support ticket tools. Either will work OK.

Q - My invite mails are not getting through to users every time. Why not?
A - Sometimes KS invite mails are blocked by spam prevention software. If the KS mails are not in junk mail, then your corporate security might not be allowing the mails through. This happens from time to time. A combination of the following steps usually resolves the issue:
Whitelist the domain @knowledgesmart.net.
Whitelist the domain @ks-server.net.
Whitelist mails from system@ks-server.net.
That should allow the mails to work OK. However, if they are still getting blocked, there is one more thing to try. Our hosting company (1&1) sometimes uses a relay server in Germany, called kundenserver.de.
Whitelist the domain @kundenserver.de.
That should sort everything out, from a corporate mail point of view.

Q - Some invite mails contain a username and password. Others do not. Why is this?
A - The KS system appends a username and password on invite mails for regular users (i.e. non-admins). If a user receives two or more test invites in close succession, the latter invite(s) will overwrite the password(s) from earlier mails. KS admins can set their own password, so the KS system does not include admin usernames and passwords on test invite mails.

Q - I can't log into my test session. A message refers me back to the system administrator. What's happening?
A - This could be a number of things. Check that the invite's expiry date hasn't passed. Check that the test doesn't have one or more modules with '0' in the 'assigned questions' box (i.e. an empty module). Check that the original invite hasn't been deleted, by viewing the Invites > History page.

Q - What happens if a 'Whoops Gremlins!' message appears during a test session?
A - If you lose a network or web connection during a test session, an error message might display. This sometimes happens as you move between questions. Click the back button once to return to the previous screen and wait for a few seconds. The test session should resume as normal.

Q - I am seeing one or more error messages onscreen during a test session, when I navigate between questions. This is not caused by loss of web connection. What else could it be?
A - If the same question has been assigned in a test more than once, this can cause an error message to display. Check to make sure that you have not duplicated the same question in one or more modules. (Quick tip: when assigning questions to modules, use a single mouse-click. A double-click can cause the same question to display twice).

Q - I can't answer the 'order list' questions in my test session because the drag & drop feature isn't working. Why not?
A - We support the following browsers: IE8, IE9, Google Chrome, Mozilla Firefox and Apple Safari. We are aware that the 'order list' type questions occasionally cannot be completed in IE6 or IE7. There is an easy workaround. Simply log out of the session, then copy and paste the original invite link into one of the other browsers listed (or use the 'Resume Test' tool in the Invites > History page of your dashboard). Enter your user details and click the 'Resume' button to log back in to the session (all existing answers will be saved). Complete the remaining question(s) and click 'Finish', then click the link to view the test report.

Q - If I log out of a test session, will my answers be saved?
A - Yes. All your answers will be saved if you resume a test session, at a later date.

Q - My test result did not upload correctly at the end of a test session. Can it be rescued?
A - Yes. If you lose your web connection at the end of a test session, occasionally a result will not upload to the dashboard successfully. Just raise a support ticket and the KS team will manually retrieve the score and add it to your dashboard. Your work will not be lost.

Q - How can I tell which version (ID number) of a test someone took? For example, I have two different Revit fundamentals tests, but I don’t know which one a specific user took.
A - When the invite mail goes out, the test ID is appended after the test name. Results mails also include the test name and ID number.
If you can’t locate the original test invite mail, you can track individual question ID numbers in the test report at the end of the test. Select a question ID from the report (any question will do), then go to Library > Draft Content > Draft Questions and enter the question ID number into the ‘Search Questions’ box. Hit ‘Search’ and you’ll filter the question library, to display the question which carries the relevant ID number. Click on the question name and view the green popup panel. About a third of the way down, you will see a section called ‘Used In (Modules)’ and another called ‘Used In (Tests)’. Here, you can read the ID number of the parent module and/or parent test, for that question.


KS Help Notes_Library 10_Precision on Free Text Questions

KS admins can determine answer precision on free text type questions.  To edit a question, go to Library > Draft Content > Draft Questions and hover your mouse over the 'Edit question' icon. Select 'Provide the answer' in the flyout menu.


If an answer is numeric, admins can allow a variance in the value users enter in the answer field, assigning full or partial marks for 'nearly right' answers.
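The grading logic might look something like this sketch, where the tolerance values and the 1.0 / 0.5 / 0.0 marks are illustrative assumptions, not the actual KS marking scheme:

```python
def grade_numeric(answer, correct, full_tol=0.0, partial_tol=0.0):
    """Award full, partial, or no marks for a numeric free-text answer.

    full_tol / partial_tol are the allowed variances; the marks
    returned here (1.0 / 0.5 / 0.0) are assumptions for illustration.
    """
    try:
        value = float(answer)
    except ValueError:
        return 0.0  # non-numeric entry scores nothing
    diff = abs(value - correct)
    if diff <= full_tol:
        return 1.0   # within the full-marks variance
    if diff <= partial_tol:
        return 0.5   # 'nearly right' earns partial marks
    return 0.0
```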


Admins can provide user hints for answer precision (i.e. the number of decimal places required in the 'correct' answer).




For non-numeric answers, we have included the option to disregard incorrect case and white spacing in the answers submitted by users.  So, for example, if the 'correct' answer to a question was 'Top DAUG', but a user entered an answer of 'Top  daug', the admin can decide whether to assign full, partial or no marks.
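In code, disregarding case and white spacing amounts to normalising both strings before comparing, roughly like this (a sketch, not the actual KS implementation):

```python
import re

def answers_match(submitted, correct, ignore_case=True, ignore_spacing=True):
    """Compare a free-text answer to the model answer, optionally
    disregarding letter case and white spacing."""
    a, b = submitted, correct
    if ignore_case:
        a, b = a.lower(), b.lower()
    if ignore_spacing:
        a = re.sub(r"\s+", " ", a).strip()
        b = re.sub(r"\s+", " ", b).strip()
    return a == b

# The 'Top DAUG' example above:
print(answers_match("Top  daug", "Top DAUG"))  # True
print(answers_match("Top  daug", "Top DAUG", ignore_case=False))  # False
```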



As a default setting, KS OTS (off the shelf) tests assign full marks to answers with minor variance in case and white spacing. Precision hints and variance for numeric answers are not enabled.


AUGI Top DAUG 2012



The AUGI Top DAUG competition was introduced at Autodesk University, back in 2000. The contest originally featured AutoCAD 2D knowledge. This year, once again, AUGI is teaming up with KnowledgeSmart to expand the range of topics to include 8 software tracks.

Here are some guidelines for this year's contest:

All contestants must attend Autodesk University in order to compete.

The contest includes modules based on the following software versions:

-3ds Max 2013
-AutoCAD 2013
-AutoCAD Civil 3D 2013
-Inventor 2013
-Navisworks Manage 2013
-Revit Architecture 2013
-Revit MEP 2013
-Revit Structure 2013

Contestants may participate in multiple modules, but only take each module once.  (Any contestant taking the same module more than once will be disqualified).

A combination of test scores and fastest times determines the winner of each module and the overall winner of the competition.  In the event of a tie, the winner of the contest will be decided by a coin toss.
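As a sketch, ranking by score (highest first) and then by time (fastest first) can be expressed as a single sort key; the names and figures below are hypothetical, and the coin-toss tie-break is left to the judges:

```python
def rank_contestants(results):
    """Order (name, score_pct, time_secs) tuples by highest score,
    then fastest time, per the contest rules above."""
    return sorted(results, key=lambda r: (-r[1], r[2]))

# Hypothetical results for illustration:
results = [("Ann", 90, 540), ("Ben", 90, 480), ("Cal", 95, 600)]
print([name for name, _, _ in rank_contestants(results)])
# ['Cal', 'Ben', 'Ann'] - Cal wins on score; Ben beats Ann on time
```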

The following are not allowed to be used in the Top DAUG contest area:

-Cell phones, smart phones or any electronic devices
-Reference books and/or websites
-Autodesk materials, training courseware or related materials
-Personal 1 on 1 coaching from Brian Mackey (just kidding, Brian! ;) )

A range of prizes will be awarded for the best performers.  See you all at the Mandalay Bay, in a couple of months!


KS user passwords

This query comes up from time to time, so I thought it worth documenting on the KS blog.

Some invite mails contain a username and password, but others do not.  Why is this?

The answer is pretty straightforward. The KS system appends a username and password on invite mails for regular users (i.e. non-admins). Further, if a user receives two or more test invites in close succession, the latter invite(s) will overwrite the password(s) from earlier mails, which means users need the most recent password for all active test sessions.

All users can reset their KS password at any time by clicking the 'Forgot details' link on the KS dashboard login page, which takes you here.




Enter your KS username and the system will email a new password.


System admins can set their own password, so the KS system does not include usernames and passwords on test invite mails for administrators.

KS admins can set their own password, by logging in to their KS dashboard and selecting the 'Change password' link.


Copy the original password into the first field, then confirm your own password and save the changes.



KS Help Notes_Invites 05_Setting up a Mock Test

Occasionally, KS firms ask if it is possible to set up a practice or mock test environment for users, so they can familiarise themselves with the KS test UI, format, question types, file download process, and so on, before taking a test 'for real'.

The answer is yes.  And it's pretty easy to set up.  Here's what you need to do...

Step One
Set up a Practice Test Account

First, set up a new account, called 'XYZ Engineers Practice Account/Sand Box/Mock Test Account', or similar.

See this link for more detailed notes on how to set up a new KS 'child' account.  Don't add any KS OTS tests when setting up this account.  Leave the KS test list blank for now.

Next, set up a new Full-admin profile for your new account.  Go to the Accounts > Your Accounts page of your dashboard and select the 'Administrators' icon next to your new practice test account.


Use the 'Add administrator' fields to add a new Full-admin to your account.


Use a general user name, e.g. XYZAdmin, and a 'catchall' email address, such as training@xyzengineers, learning@xyzengineers, or similar.

Look out for the system mail with your new admin's login details. Alternatively, reset the password on your new admin profile.


Log in as your new administrator. Now, hit the 'Change password' link to create a new generic password, e.g. XYZ123. These login details will be made available to all users, so should be easy to remember.





Last, change your new admin profile from Full-admin to Sub-admin status.  Don't forget, a Sub-admin user profile can only access a KS test from a browser.  It cannot be used to access your main KS dashboard.


Now that you have set up your new practice test account, you need to create a mock test, for your users to log into.


Step Two
Set up a Mock Test

Choose a suitable topic for your mock test, e.g. AutoCAD, Revit, Civil 3D or MicroStation. Log into your main KS 'parent' account and create a new practice test from scratch. See this link for more detailed notes on how to create your own KS modules and tests. See this link for more detailed notes on how to import and edit a KS OTS test.

Set up your new test so that you only have a single module, with 3 or 4 sample questions presented in your mock test.  Save your changes and publish your new test.  See this link for notes on how to publish draft KS content.

Now, go to the Accounts > Manage Content page and copy your new mock test across to your practice test account.  See this link for notes on how to copy published tests from a 'parent' account to a 'child' account.

Now, any user can log in to a mock test session, using the generic Sub-admin profile, to get a preview of what a live KS test session looks and feels like.

You might want to create some simple user instructions on how to log in to a KS test from a browser.  This link will give you some useful info to share with your team.


KS Devs Update

As Q4 rolls around, we thought it would be useful to jot down a brief summary of what we're working on at the moment.

KS Library

All of the existing (Autodesk) library titles have been updated to 2013 format.  Our favourite task of the year. (Not really).  This includes: Revit (x3), AutoCAD, Civil 3D, Navisworks, Design Review and 3ds Max.

We're over halfway to finishing an expanded Inventor 2013 question set. Working with ace US author, John Evans, we are covering 4 key areas: Assemblies, Part Modeling, Drawings (and Presentations), and Welding & Sheet Metal. Our goal is to complete the set in time for AU2012.

We're compiling a larger set of general BIM related questions, for the KS Community area of the library.  These will be a FREE addition to the KS library.  Should be ready in time for AU, or thereabouts.

We have started work on a set of ArchiCAD fundamentals questions - our first foray into the Graphisoft world.  Covering 4 levels of difficulty, this set will be available in the New Year.

Advanced RAC, advanced RST and Revit for occasional users are all works in progress. Our deadline for these sets has slipped a bit, due to author commitments, but we're aiming for something on all 3 fronts by AU2012.

USACE has put together a working group to check the Attachment F assessment material.  We're awaiting the green light from the Corps, to release this free question set into the general KS library.

On the drawing board (can we still use that phrase?) for the next round of updates are the following topics: advanced RMEP, advanced Civil 3D, Bentley ProjectWise, AutoPlant and Plant 3D.

A variety of vendors are currently discussing writing test material for their own software products.  These include: CADS (CADS RC), Tekla (Tekla Structures), Synchro (Synchro Professional), Savoy (AutoTrack) and Codebook (Codebook).


KS System

The latest release of the KS tools hit the live site in early October.  This completed our program of works for the summer, within two weeks of our original goal, which is excellent.

Next up, we have two point-releases in the works, which address the remaining items on the KS user group wish list. These include:

-Capturing additional user data for more detailed statistical & benchmarking analysis
-Update to the test report format
-Additional library searching & grouping
-Tidy up of the users data page (including option to edit and delete master records)
-Prep for supporting different languages in the test library
-Update to account hierarchy options (including option to delete and move accounts)
-New charting option for flagged training request data

Plus a few other smaller tidy-up tasks.

We'll also be moving to a new cloud server next month, which gives us a greater capacity to grow.  We anticipate a small amount of down-time, during the move, so we'll keep you updated.

Looking ahead, we'll be discussing the following additional features with our user group, in the coming months: User dashboard pages, new question types, survey questions & charting, links to third party systems, more detailed regional benchmark stats, additional charting & reporting options and a web based skills matrix.


Out and About

We'll be attending a few events in the next couple of months.  Autodesk host their annual BIM conference, in London, on 19 November.  The same week sees a free BIM learning and networking conference, hosted by Ramboll, in London, on 22 November.

Plus, KS is once again co-hosting the AUGI Top DAUG skills contest at Autodesk University.

So hopefully we'll have a chance to cross paths at some point this quarter.  Once again, thanks to our ever-enthusiastic user group, for a steady stream of ideas for new system features and library additions.


KS server update

Quick technical heads up for this Sunday. We're moving the KS system from the existing Rackspace local server to a cloud based server. Nothing especially earth shattering. But an important update to our back-end hosting, which sets us up well for next year and beyond. The nice thing about cloud based hosting is the scalability. If we need to ramp up our RAM or increase overall capacity in future, then we'll have greater capability to do so.

So for 2-3 hours on Sunday 09 December, around 12pm EST / 5pm GMT, you might experience a little down-time in the live system.  We'll do everything we can to minimise the impact of this work.


AUGI Top DAUG 2012 - The Results



Building on the success of last year's contest at AU, KnowledgeSmart and AUGI once again teamed up to provide an interactive skills assessment, across 8 popular Autodesk software tracks: 3ds Max, AutoCAD 2D, AutoCAD Civil 3D, Inventor, Navisworks, Revit Architecture, Revit MEP and Revit Structure.

Here's a brief history of the TD contest, from AUGI.  And here's a summary of last year's competition.

Once again, we had some cool prizes up for grabs.  Track winners walked away with a $100 USD Amazon voucher (a combination of high scores and fastest times determined the winner of each track) and the overall winner won a HP laptop and a free pass to AU2013.

Here's a shot of the contest rules board:


We spent several hours on the Monday, setting up 18 networked PCs in the exhibition hall, at the AUGI stand. We got off to a slow start when the exhibition hall manager decided that the PCs were facing the wrong way round. So we dismantled all 18 PCs, turned them around 180 degrees and wired them all back in again! (Which, of course, we were delighted to do!)


Now properly oriented, all 144 copies of Autodesk software needed checking, then all the KS sample data sets had to be uploaded to each machine and checked again. Big thanks to Joe Croser and Mike Johnson for their help in accomplishing this rather onerous task!

Here's how we looked when we were all nicely set up:


The competition ran over 2 days (1 x 3 hour slot on day one, then 1 x 2 hour and 1 x 3 hour slot on day two). Contestants had to answer 10 questions, using the 2013 version of each software title. Each session was limited to just 10 minutes, so people had to work fast!  Special thanks to awesome AUGI team members, Kristin, Bob, Michael, Donnia and Richard, for briefing users and making sure everything ran smoothly.

Here are the Top DAUGs in action:


By the end of the contest, we posted 336 results, with 0% failed tests. Considering the hotel web connectivity was, at times, positively creaking with the competition for bandwidth in the main hall, we are rather proud of the resilience demonstrated by the KS system.  Particularly when we saw some users juggling a beer in one hand, a mouse in the other and a bag of popcorn balanced somewhere in the middle! (Editor note: When you're writing technical specs for software, somehow you never quite anticipate all of the live scenarios your code might end up experiencing! :) ).

Throughout the competition, we posted a rolling list of the top 10 contestants for each track, on the AUGI big screen.



The Results

Congratulations to the following contestants, who won their respective software tracks:


And a special mention to the overall winner of AUGI Top DAUG 2012:

Fred Wismer


Analysis

So, let's take a detailed look at the results of this year's Top DAUG competition.

Overall

No. of Tests Completed:  336
Overall Average:  41% in 9 mins 26 secs
(NB the average score for 2011 was 53%).




Track 1 - 3ds Max


Track Winner: James Clarke (NB James is the Max track winner for the second year running).

Winning Score: 75% in 10 mins 0 secs

Top 10 Contestants:


No. Completed: 13
Group Average: 17% in 9 mins 34 secs


Track 2 - AutoCAD 2D

Track Winner: Glenn Sinclair
Winning Score: 100% in 8 mins 5 secs

Top 10 Contestants:


No. Completed: 113
Group Average: 37% in 9 mins 45 secs


Track 3 - AutoCAD Civil 3D

Track Winner: Fred Wismer
Winning Score: 100% in 7 mins 30 secs

Top 10 Contestants:


No. Completed: 46
Group Average: 61% in 8 mins 20 secs


Track 4 - Inventor

Track Winner: Tracy Chadwick
Winning Score: 60% in 10 mins 0 secs

Top 10 Contestants:


No. Completed: 16
Group Average: 22% in 9 mins 50 secs


Track 5 - Navisworks

Track Winner: James Austin
Winning Score: 80% in 10 mins 0 secs

Top 10 Contestants:


No. Completed: 19
Group Average: 42% in 9 mins 34 secs


Track 6 - Revit Architecture

Track Winner: Michael Patrick
Winning Score: 98% in 9 mins 30 secs

Top 10 Contestants:


No. Completed: 84
Group Average: 55% in 9 mins 20 secs


Track 7 - Revit MEP

Track Winner: Maxime Sanschagrin
Winning Score: 53% in 10 mins 0 secs

Top 10 Contestants:


No. Completed: 26
Group Average: 34% in 9 mins 50 secs


Track 8 - Revit Structure

Track Winner: Eric Bernier
Winning Score: 95% in 10 mins 0 secs

Top 10 Contestants:


No. Completed: 19
Group Average: 43% in 9 mins 32 secs


Popular Tracks

The most popular tracks, in order of completed tests, were as follows:

AutoCAD 2D - 113 results
Revit Architecture - 84 results
AutoCAD Civil 3D - 46 results
Revit MEP - 26 results
Navisworks - 19 results
Revit Structure - 19 results
Inventor - 16 results
3ds Max - 13 results


Range of scores

Interestingly, across the 8 tracks, we saw scores ranging from 0% to 100%. Here is a summary of both ends of the performance scale:

4 x 100% scores (2 x AutoCAD 2D, 2 x AutoCAD Civil 3D).
12 x 0% scores (1 x AutoCAD 2D, 4 x AutoCAD Civil 3D, 3 x Navisworks, 3 x Revit MEP, 1 x Revit Structure).


Honourable mentions

Along with our track winners, the following contestants deserve a special mention, for their performance in the competition:

Brian Mackey - yet another stellar performance from our 'Mr Consistent' competitor. Just pipped for 2nd place in the RST track, top 3 finish for Navisworks, plus a very good Revit Architecture score.

Brent McAnney - a winner in last year's contest, placed top 3 in both AutoCAD and Civil 3D tracks.

Glenn Sinclair - winner of the AutoCAD 2D track, top 10 for Revit Architecture and 2nd place overall in the contest.

Tracy Chadwick - winner of the Inventor track, top 5 finish for AutoCAD 2D, plus a very good Revit Architecture score.

Kate Morrical - another consistent performer, placing top 3 in both AutoCAD 2D and Revit Structure tracks.

Rebecca Frangipane - track winner from last year, top 5 finish in Revit Structure, plus a very good Revit Architecture score.

John Fout and Ben Downey for completing 4 tracks apiece.


Boys vs Girls

Finally, let's take a look at the demographic breakdown of the competition. Out of 336 contestants, 303 were male and 33 female. The average overall performance for each group breaks down like this:

Girls: 49% in 9 mins 18 secs
Boys: 41% in 9 mins 29 secs
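As a quick sanity check, the overall figure can be recomputed as a size-weighted average of the group figures. The published percentages are rounded, so the result only approximates the 41% overall average quoted earlier:

```python
def weighted_average(groups):
    """Combine per-group averages into an overall mean, weighted by
    group size. groups: list of (count, avg_pct) tuples."""
    total = sum(count for count, _ in groups)
    return sum(count * avg for count, avg in groups) / total

# 303 male contestants averaging 41%, 33 female averaging 49%:
print(round(weighted_average([(303, 41), (33, 49)]), 1))  # 41.8
```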


So, that's Top DAUG done and dusted for another year. A thoroughly enjoyable 3 days at AU2012. 336 completed tests, across 8 popular Autodesk software applications. Once again, the overall standard was extremely high, with some outstanding individual performances from our track winners and top 10 contestants. And once again the Civil 3D track takes the overall prize!

Congratulations to all our winners. Thanks to the AUGI team for all their support, with a special mention for AUGI President, David Harrington and past-President, Mark Kiker. Lastly, a big thank you to everyone who took part in this year's contest.


2013 Library Updates

Happy New Year!

We've been busy over the holiday period, putting the finishing touches to a selection of new KS library updates.  These include:

Revit Architecture 2013 - Advanced
Part 1 of 2 sets of challenging questions, looking at more advanced aspects of applying RAC on projects. Written by Paul Aubin.

Revit Architecture 2013 - Occasional Users
An introductory set of questions, written by Chris Senior, with part-time users of the software in mind.

Inventor 2013 - Part 3 (Drawings)
Part 3 of 4 sets, written by John Evans. Part 1 covers Assemblies. Part 2 covers Part Modeling. Part 4 to follow in Feb/Mar covering Welding & Sheet Metal.

BIM Management
A general set of questions, edited by Evolve Consultancy, covering a variety of different aspects of BIM Management and Coordination, including IFC and COBie.

Plant 3D - Part 1 (Project Setup and Specifications)
A general level set of questions, written by Joel Harris, covering project setup, template creation and customization. Parts 2-5 to follow in the coming months, covering the following topics: Piping specs & catalogs, 3D modeling, Document generation and P&ID.

USACE BIM Process
A general set of questions about the official BIM process, as created by the US Army Corps of Engineers. This assessment comprises questions looking at both the Policy and Content of the USACE BIM protocols. Edited by a sub-committee of the USACE BIM team.

We are working on some new titles in the Spring months.  These include:
Inventor Part 4 - Welding & Sheet Metal
Plant 3D Part 2 - Piping Specifications & Catalogs
Revit Structure 2013 Advanced
Revit MEP 2013 Advanced
Civil 3D 2013 Advanced
ArchiCAD
Bentley ProjectWise
Synchro


KS & Global e-Training Links

KnowledgeSmart has been working with the team at Global e-Training, to map the results for our most popular test titles, to the corresponding modular training content, presented within the GeT Learning Management System.

These titles now have full links to learning enabled:


AutoCAD 2D fundamentals
AutoCAD 2D for occasional users
AutoCAD Civil 3D fundamentals (parts 1-4)
Navisworks fundamentals
Revit Architecture fundamentals
Revit Structure fundamentals
Revit MEP fundamentals (Mechanical)
Revit MEP fundamentals (Electrical)
Revit MEP fundamentals (Plumbing)

To follow in the Spring:

Inventor fundamentals (parts 1-4)
3ds Max fundamentals (parts 1-4)


Click here for a short video, which illustrates the user journey between testing and training.


Testing Revit Architecture Skills - A 3-Step Approach


Revit Architecture is the most popular test title in the KS library. No big surprise. So let's take a look at how an AEC firm might create an appropriate testing environment for Revit Architecture, catering for users of all levels of experience and abilities.

Firstly, it's important to establish that not all users need to become Revit 'experts'. Many people have a primary role which means they come into contact with Revit on an occasional basis, but no more than that. So, do they need to become adept in the finer points of massing, curtain walls and Family editing? Certainly not! For example, PMs, Project Architects, Practice Principals, and so on. However, in many instances, these people can benefit from a degree of knowledge and familiarity with the technology, not least from a project planning or resourcing perspective. So some basic knowledge of the tools can be beneficial.

For these individuals, we have created a 'Level 1' assessment, called 'Revit Architecture for occasional users'. As the name suggests, it is a fairly gentle, introductory level test, which looks at some basic concepts, including:  Files & File Formats, Navigation, Views & Sheets, Measuring, Exporting Data, Families and Element Selection.

The fact is, casual users of Revit software sometimes make silly mistakes, mainly through lack of understanding, thus creating work for more experienced users tasked with tidying up their mess. This is a common complaint from design firms and one which is easily avoided. Simply put, no one gets access to a license of Revit without first achieving a minimum 'pass' mark on the Revit occasional test (level to be determined by individual firms). Don't give someone the keys to the car without first establishing they have a basic understanding of how to drive.

For the primary modeling team, we have the 'Level 2', or 'Revit fundamentals' level material. This is a general level test, covering a wider range of topics, including: Basic Element Creation, Views & Sheets, Detailing, Keynoting & Annotation, Worksharing, Dimensions & Rules, Interoperability, Families & Parts, Scheduling, Coordinates & Orientation and Outputs.

This level of test material can be used to create reliable benchmark data for the firm, compared to industry average statistics. Performance 'quartiles' (see image below) are an effective way of targeting users with incremental productivity improvements over time, with the appropriate modular training workshops addressing highlighted skills gaps.




For users placing in upper quartile 3 or quartile 4, we have a variety of modules which are designed to address more advanced Revit concepts. The 'Level 3' material covers more process based scenarios and looks at the impact of using the software in a project environment. Topics such as Revit Families, Work flows, Project process and Worksharing are covered in greater detail. 

In addition to test modules about Revit software, we have written some general questions about BIM and BIM management. So firms can create assessments which cover technology and also the wider Building Information Modeling environment, within which the software is deployed.

By targeting users with material appropriate for their job function, and also their current level of ability, AEC firms can now create more meaningful benchmark data, plan a more focused training strategy and adopt a stepped approach to measuring Revit and BIM knowledge across their teams.


KS user group - April 11


It's that time of year again. We have scheduled the date for the next KS user group.

Here are the details:

Location:  Smeaton Room, Buro Happold, 17 Newman Street, London, W1T 1PD
Date:  Thursday 11 April
Start:  09.00
End:  14.00
Cost:  Free (Breakfast & lunch included)

We will be reviewing the latest KS tools in some detail, including a look at what’s coming up in the next 3-6 months. Plus a guest presenter or two.  Full agenda to follow, as the date gets closer. 

Places will be limited to 25, on a first come, first served basis.

Hope to see you there!

R

Learning Plateaus


At KnowledgeSmart, we spend a lot of time speaking to firms about performance improvement. So what happens when, in the pursuit of advanced learning, we experience a slow-down in our development? We hit what is sometimes referred to as a 'learning plateau'.

A learning plateau occurs when forward progress seems to have stopped while engaged in learning a new skill. These plateaus are normal and commonly experienced periodically when learning to play a musical instrument, speak a new language, or learn some other complex discipline.

Dutch author, Lodewijk van den Broek, describes hitting the plateau as, ‘the experience where you feel that no matter how hard you try, there is no progress in learning. And even though this is not entirely true, the feeling is very real’.

According to van den Broek, there are two major things happening here. The first is that you have progressed past the bend in the 'learning curve'. There are different stages in the learning curve:
  •  Beginner stage (0% to 60%): the curve is very steep and you learn very fast.
  •  Intermediate stage (60% to 80%): this is past the bend in the total curve. Learning speed declines and this phase takes longer than the beginner stage.
  •  Advanced stage (80% to 95%): learning speed declines even further and the length of the stage increases yet again.
  •  Expert or Master stage (95% to 100%): learning speed drops to slow progress and the length of the stage stretches into eternity.

So once you hit Intermediate level, your progress slows down. And at the same time it gets increasingly difficult to measure that progress. These effects combined result in the feeling that you have hit a plateau in learning.
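As a rough illustration only, the stage boundaries above can be expressed as a simple lookup from proficiency to stage. The boundary values come from van den Broek's list; the function itself is a hypothetical helper, not part of any KS tool:

```python
# Illustrative sketch: maps a proficiency percentage to van den Broek's
# learning-curve stage. Boundaries are taken from the list above.

STAGES = [
    (60, "Beginner"),        # 0% to 60%: steep curve, fast learning
    (80, "Intermediate"),    # 60% to 80%: past the bend, progress slows
    (95, "Advanced"),        # 80% to 95%: slower still, longer stage
    (100, "Expert/Master"),  # 95% to 100%: very slow progress
]

def learning_stage(proficiency: float) -> str:
    """Return the learning-curve stage for a proficiency between 0 and 100."""
    for upper_bound, name in STAGES:
        if proficiency <= upper_bound:
            return name
    raise ValueError("proficiency must be between 0 and 100")
```

Note how the stages get wider as you move up the scale: the Beginner stage spans 60 percentage points, while Expert/Master covers only the final 5, which is exactly why progress feels slower the better you get.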

US authors Brett & Kate McKay describe a road map for overcoming performance plateaus. Back in the 1960s, two psychologists, Paul Fitts and Michael Posner, set out to uncover why we plateau. They discovered that when we acquire a skill, we go through three stages.

The first stage of skill acquisition is called the cognitive phase. In this phase, we must concentrate intently on what we’re doing as we figure out strategies on how to accomplish the skill more efficiently and effectively. The cognitive phase is riddled with mistakes as we learn the ins and outs of our new pursuit.

The second phase of skill acquisition is the associative phase. During the associative phase, we make fewer mistakes. Consequently, we feel more comfortable with the skill and begin to concentrate less on what we’re doing.

The final stage is the autonomous phase, or what Joshua Foer, author of ‘Moonwalking with Einstein’, calls the “OK plateau.” We reach a skill level where we’re able to capably do the task without having to really think about it at all. Remember how much you thought about what you were doing when you first got your driver’s license? Now driving is fairly automatic, like brushing your teeth.

People used to think that you couldn't break past these plateaus because a plateau represented the limit of your genetic ability. No amount of exertion or education would help you overcome this wall. But Fitts, Posner, and other psychologists discovered that with the right approach and a few attitude adjustments, all of us can bust through our plateaus and reach even higher.

How to Overcome Plateaus
Take risks. Growth comes when we stretch past our comfort zone. The big reason many people (especially high-achievers) plateau is that they don’t like to fail. Instead of taking on challenges that will help us grow, we stick with routines that we know we can successfully complete. To protect our ego, we’d rather do the wrong things correctly than do the right things wrongly. This aversion to risk is a recipe for plateauing.

Embrace your failure. To overcome your aversion to risk, you have to give yourself permission to fail and be mediocre. Instead of avoiding the things that are hardest for them, the greats of the world specifically focus on those things; they purposefully concentrate on the areas in which they make the most mistakes. This keeps them from getting stuck in the autonomous phase and propels their progress. So instead of seeing failure as a negative thing, think of your failures as steps to success. If you choose to learn from your failures, they can bring you closer to your goal.

Seek honest feedback. Another reason we plateau is that everyone around us is telling us everything is fine. We often confide in people who tell us what we want to hear, not what we need to hear. For example, we finish a project and take it to somebody for some “constructive criticism,” when really we just want some positive affirmation on what we did. If you feel stuck in an area of your life, seek out mentors who won’t pull any punches and will give you the honest criticism you need to improve. Yes, your ego will get bruised, but that’s the price one must pay for personal and professional growth.

Practice deliberately. Fitts and Posner discovered three keys to breaking through your plateau: 1) focus on technique, 2) stay goal-oriented, and 3) get immediate feedback on your performance. In other words, you need to practice deliberately to break through plateaus.

Get back to basics. Even when you’re advanced at something, delving back into the basics can actually give you fresh insights that help you progress even further.

Think long term. When we think short-term, we have a tendency to feel that plateaus are permanent. But when we take the big picture view of things, we start to see plateaus as temporary way-stations that we’ll eventually get past with a bit of hard work. Moreover, by thinking long term, we give ourselves more latitude to take risks and fail because we see that missteps are just momentary setbacks in the long journey of life.
To cultivate this attitude, reflect on a time where you felt you had reached the end of your development in some area, only to later bust through the plateau. If it was possible then, it’s possible now.

Psychology author, Russell A. Dewey, PhD, describes what happens with a typical learning curve.
With repetition of almost any motor task, learning occurs, and a person becomes more efficient or effective at carrying out a task. Progress in skill learning commonly follows an S-shaped curve, with some measure of skill on the Y-axis and the number of trials on the X-axis. Progress is slow at first, then a subject may experience a burst of learning that produces a rapid rise on the graph.



The S-shaped "learning curve" typical of complex learning

Why does an S-curve of growth eventually level off?
What people call a plateau may be a period of stability after a skill is learned as well as it can be learned. Most growth processes follow the same S-shaped curve as motor learning. In general, an S-shaped curve of growth levels off because stability is attained, a resource needed for growth is limited, or a ceiling of performance is reached.

What phases of learning a complex skill cause the "S-curve" pattern?
The S-shaped learning curve is most obvious when someone learns a highly complex task. The initial part of the curve rises slowly as a person becomes familiar with basic components of a skill. The steep ascending phase occurs when there is enough experience with rudiments or simple components to start "putting it all together." Rapid progress follows until the skill "hits a ceiling" or stabilizes at a high level.
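An S-curve of this kind is commonly modelled with a logistic function. The sketch below is purely illustrative: the parameter values are arbitrary, chosen only to show the shape Dewey describes, and are not drawn from any KS data.

```python
import math

def logistic_skill(trial: float, ceiling: float = 100.0,
                   midpoint: float = 50.0, steepness: float = 0.15) -> float:
    """Skill level after a given number of practice trials, modelled as a
    logistic (S-shaped) curve: slow start, rapid middle, plateau at the
    ceiling. All parameter values are arbitrary illustrations."""
    return ceiling / (1.0 + math.exp(-steepness * (trial - midpoint)))

# Early trials: slow progress; middle trials: rapid rise; late trials: plateau.
early = logistic_skill(10) - logistic_skill(0)
mid = logistic_skill(55) - logistic_skill(45)
late = logistic_skill(100) - logistic_skill(90)
```

Comparing the three gains shows the characteristic pattern: the same ten trials produce a large jump around the midpoint but only marginal improvement at either end of the curve, which is the plateau experience in numerical form.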

To put this in the context of AEC-related skills monitoring, KnowledgeSmart analyzed the results of several thousand test results, covering basic 2D CAD skills (using AutoCAD and MicroStation software).

The results were as follows:

0-1 years' experience:  53% in 87 mins

4-5 years' experience:  60% in 77 mins

8-10 years' experience:  61% in 79 mins

Many times, we hear people say, 'I don't need skills testing or training, because I've been using (XYZ software) for many years.' Could it be that, even in the case of experienced users, a learning plateau might have been reached, and that a fresh approach to learning could prove beneficial in helping to further improve performance?

R

BIM Show Live TopGun



Last week, we hosted the inaugural BIM TopGun skills contest, at BIM Show Live 2013.

In its third year, the conference was a huge success, with over 600 BIM-savvy souls networking, sharing and learning over two action-packed days at the Crowne Plaza in London's Westminster, just a stone's throw from Big Ben and the Houses of Parliament.

This year, KnowledgeSmart was approached by the show's organising committee to replicate the popular 'Top DAUG' competition format from Autodesk's annual user conference in Las Vegas.  Never one to shirk a challenge, we worked with the BSL committee to create a vendor-neutral, general BIM knowledge quiz, covering topics including BxPs, COBie, IFC data, file formats, what happens when you say 'BIM Model' whilst presenting on stage (see below) and other such exotic matters!


The overall quiz comprised 15 questions, plus a bonus question, to be completed in 10 minutes or less. (In fact, we generously added 3 extra minutes to allow for the in-house Wi-Fi connection, which was creaking at times!).

After a slightly tentative start, we soon had a steady stream of willing BIMmers, all competing to be crowned the UK King (or Queen) of BIM!  As well as competing for bragging rights and the title of BSL 2013 BIM TopGun, the winner was also to be presented with a coveted BIMMY award by the Conference Committee.  So, frankly, the stakes could not have been higher.  To add fuel to the fire, many prominent members of the world-renowned #UKBIMCrew were also in attendance, each one determined to crush his or her competitors into the dirt! (Don't let the slightly enigmatic smiles fool you, this was BIM WAR!)



Over the course of the two days, we posted 74 scores, ranging from 25% to 83%.  Here is a breakdown of the results:

Overall average score & time = 53% in 11 mins 56 secs.

Average score & time for the boys = 53% in 11 mins 53 secs.

Average score & time for the girls = 53% in 12 mins 34 secs.




Here is a breakdown of the Top 10 Leader-Board, after the final results were tallied:


The Top 10 scores ranged from 83% to 71%.

And massive congratulations to our overall WINNER, Mr Dave Lee, from HDR's London office, who is now officially the UK's most prolific BIM Champion (probably).


Here is a piccie of Dave accepting his BIMMY, from BSL Chair, Rob Charlton.


It should be noted, for the record, that several #UKBIMCrew members were conspicuously absent from the list of TopGun participants, come the end.  We thought long and hard about publishing a 'Wall of Shame', but decided in the end that this might be a step too far! :)

Congratulations to everyone who took part and thanks for making it such a fun event.

R

Building Information Model... Model


Saying “BIM Model”, or “Building Information Model Model”, is a tautology. Extensive research by Professor James Vandezande, supported by a leading team of linguistics scientists at HOK, has proved beyond all reasonable doubt that, when said in conjunction, the phrase causes the heart of a fluffy kitten to stop!


This affliction is sometimes referred to as RAS syndrome (short for "redundant acronym syndrome syndrome"), also known as PNS syndrome ("PIN number syndrome syndrome", which expands to "personal identification number number syndrome syndrome") or RAP phrases ("redundant acronym phrase phrases"), referring to the use of one or more of the words that make up an acronym or initialism in conjunction with the abbreviated form, thus in effect repeating one or more words.

Instead, try saying “The model”, “The BIM” or any number of other harmless phrases.

R

KS New Release

The next release of the KS software hits the live site this month.  Here's a summary of new features and updates.

Cleaner KS login page

We have simplified and tidied up the dashboard landing page...




... and the dashboard / assessment login pages:



User data capture pages

We are keen to continue providing interesting and relevant user demographic data, for statistical and comparative analysis. A regular feature request is the ability to capture additional background information for candidates, so we added two new data information pages (one at the start and one at the end of a test session):



These additional pages are optional. KS admins can choose to display (or not display) the data capture pages, with a new dashboard setting.


The 5 user datafields can be presented on data capture page 1, with an optional/mandatory status, by selecting the relevant fields on the Settings > User Datafields page.


Improved test report

We have updated the format of the KS test report, making it easier to navigate to the relevant feedback sections, adding a new field for number of logins/logouts and a new 'Additional Information' section for candidate profiling.



Improved content management

We added a number of improvements to the Library section of the KS dashboard.  This includes adding the training tag export tool to the main KS Tests menu page:


Allowing admins to edit answer options for multiple choice / order list / pick list type questions:


We improved the KS content editing process by adding the green information panel to the 'Your questions' box and 'Your modules' box, making it easier to identify KS content.


We also included the ability to re-order questions inside KS modules and modules inside KS tests.


Improved searching & grouping options

We added a new search table to the Invites > History page, making it easier to filter the data on this page.


We added a new field for 'Test name' on the dynamic grouping table, in the Results > Data page.


Improved user records page performance

We have made KS user records fully editable and deletable, and added simple pagination for accounts with extensive user lists. This has improved the performance of the users page. We have also added the ability to export records from linked accounts.



Plus a number of smaller updates and bug fixes, including:

Improvements to test UI performance, including removing the 5-second page 'flicker' when the KS software pings the server to register the latest test data.
Update to question csv export, to include distractor answers for multiple choice / pick list question types.
Update to the naming convention for imported tests.
Uniform date format on the Results > Data page and related csv exports.
Update to error messaging for non-admins attempting to log in to a KS dashboard.
Update to error messaging for incorrect file uploads.
Update to error messaging for duplicate questions appearing in a test session.
Update to results searching & editing in our own KSadmin database.
Update to management of user records in our KSadmin database, including the ability to merge two user records.
Better handling of apostrophes in KS user names, e.g. Rory O'Vance.
Bug fix to address amended KS account names not displaying correctly on invite mails.
Bug fix on administrator notification mail failure for new account setup.
Bug fix on user record deletion on the Move Users page.
Update to specification of KS Rackspace cloud server box.

We have a further software release scheduled for June, which completes the latest round of KS devs.

As always, our thanks to the KS user group for your feedback, suggestions and ideas for continual advancement and improvement of the KS system.

R

KS Help Notes_Invites 06_Self-Inviting to a KS Test


If you want to allow users to invite themselves to take a KS test, here is what you need to do.

Set up a new Full-admin profile for your KS account.  Go to the Accounts > Your Accounts page of your dashboard and select the 'Administrators' icon next to your test account.


Use the 'Add administrator' fields to add a new Full-admin to your account.


Use a general user name, e.g. XYZAdmin, and a 'catchall' email address, such as training@xyzengineers, learning@xyzengineers, or similar.

Look out for the system mail with your new admin's login details.  Or, alternatively, re-set the password on your new admin profile.


Log in as the new administrator.  Now hit the 'Change password' link to create a new generic password, e.g. XYZ123.  These login details will be made available to all users, so they should be easy to remember.


Lastly, change your new admin profile from Full-admin to Sub-admin status.  Don't forget, a Sub-admin profile can only access a KS test from a browser.  It cannot be used to access your main KS dashboard.


Now that you have set up your new general Sub-admin profile, create some simple user instructions, so that all your users can log in and invite themselves to take a KS test.  Here is a brief guide to logging in to a test session from a browser.

R

KS Help Notes_Settings 08_Capturing Additional User Data


KS admins have the option of capturing additional user information, to gain a more rounded picture of a person’s background and experience.

Go to Settings > Test UI Options and check the ‘Display user data capture pages’ box if you want to display the additional user information capture screens at the start and end of a test session.


The first data capture page appears during the test login process, just before the start page.


You can present the 5 user datafields, including the option for these fields to be optional or mandatory, by checking the appropriate boxes on the Settings > User Datafields page.


You can also capture the following additional information:

- Primary Industry/Discipline
- Primary Role
- Country
- State (if USA selected)

Lastly, users have the option of providing a ‘self-rating’ of their knowledge of the topic, for which they are about to take a test.

Please rate yourself on your ability to use the software, for which you are about to take a skills assessment, on a scale of 1 to 5:

1 = Very basic knowledge; not enough to work confidently on a project
2 = Basic knowledge; can get by working on a project but could do better
3 = Good knowledge; can produce a good standard of work on projects
4 = Advanced knowledge; can teach the basics to others
5 = Expert knowledge; can perform and teach others at an advanced level
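If you work with exported self-rating values, a tiny helper can turn the numeric rating back into its short label. The labels below are taken from the scale above; the function itself is our own hypothetical example, not part of the KS dashboard:

```python
# Maps the 1-5 self-rating captured on the first data page back to its
# short label. Labels come from the scale above; helper is illustrative only.
SELF_RATING_LABELS = {
    1: "Very basic knowledge",
    2: "Basic knowledge",
    3: "Good knowledge",
    4: "Advanced knowledge",
    5: "Expert knowledge",
}

def describe_rating(rating: int) -> str:
    """Return the short label for a 1-5 self-rating."""
    try:
        return SELF_RATING_LABELS[rating]
    except KeyError:
        raise ValueError(f"self-rating must be 1-5, got {rating}") from None
```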

The second data capture page appears at the end of a test session, just before the results screen appears.


Here, you can capture the following additional information:

How many years have you been using BIM/CAD/Engineering software?

How often do you use BIM/CAD/Engineering software?
Regularly (daily)
Part time (weekly)
Occasionally (monthly or less)

What BIM/CAD/Engineering software do you regularly use?

How did you primarily learn to use BIM/CAD/Engineering software?
School/College/University
Formal training
On the job/self taught
Online/web based training

Where did you first learn to use BIM/CAD/Engineering software?

Plus an extra field for users to provide additional comments & feedback.

The information captured is presented at the bottom of each KS test report.


R

MottMac BIM TopGun



This week, we attended the annual BIM Forum, hosted by Mott MacDonald. This is our fifth forum, and we have seen the event grow steadily year on year. This year's event in Birmingham was the biggest yet, with nearly 200 delegates in attendance, from all over the UK and parts as far-flung as Europe, the US, Singapore, India and the Middle East. Kudos to Gavin Skidmore and his team for all the hard work and preparation that goes into this conference, which is always a tremendously enjoyable affair.

This year, we hosted the inaugural MM BIM TopGun competition, based on the same format as the contest from BIM Show Live, but with questions specifically written by MottMac staff to test knowledge and understanding of the Mott MacDonald world of BIM.

Delegates had just 10 minutes to answer 18 knowledge-based questions, covering a range of different BIM-flavoured topics, including MM BIM Champions, MM BIM vision, COBie, software use, file formats, BS1192, clash detection and BIM process.

And the stakes were high, as co-sponsor DELL very kindly donated an XPS10 tablet and keyboard as the prize for the highest score.



Spread over the two days of the conference, we saw a steady stream of willing volunteers, all vying for the coveted title of MottMac BIM Top Gun. 


Over the course of the two days, we posted 51 scores, ranging from 22% to 83%.  Here is a breakdown of the results:


After a closely contested BIM battle, where just 9 points separated the Top 10 protagonists, here is a breakdown of the Leader-Board, after the final results were tallied:


And massive congratulations to our overall WINNER, Ian Besford, who is now officially MottMac's most prolific BIM Guru (probably). 


Here is a summary of the most commonly highlighted training issues, based on where most points were dropped:


So a thoroughly enjoyable couple of days. Special thanks to Gavin and Margaret for their hospitality. Next year's forum takes place in sunny Hawaii (well, no harm in registering our preference for location early! ;) ), so we're looking forward to MM BIM Top Gun Round 2.

R