The Learning Circuits Blog
- Gamification Blog Book Tour Underway!
- Measuring Learning: The Theory of Change
- Why Measurement Matters
Gamification Blog Book Tour Underway! Posted: 01 May 2012 08:02 AM PDT

The blog book tour has started as ASTD and I kick off a 25-stop blog book tour for the ASTD co-published book The Gamification of Learning and Instruction: Game-based Methods and Strategies for Training and Education. The book is available at the ASTD Book Store. The tour includes stops at several ASTD chapter web sites, including Philadelphia, New York, and Houston, Texas.

You are welcome to join the tour; in fact, you are urged and encouraged to join. Leave a comment on this post linking to your blog and it will "officially" become part of the tour. The tour currently includes some well-known bloggers and some bloggers you really need to know, but we want to expand it with your input, ideas, and concepts related to gamification.

The blog book tour is a virtual tour, so you can follow along stop by stop. If you don't have the book yet, stop by the ASTD Book Store and pick up a copy. The Twitter hashtag for the tour is #gamiLI. There is a Pinterest page for the tour, and a Facebook page too; stop by and give it a LIKE. :)

All of the tour stops and dates are listed below. You can follow along by going blog to blog and leaving a comment. If you leave a comment at every stop, you will receive a free whitepaper, "The First Five Steps to Gamification of Content, Curriculum and Courses." Also, on April 26th, join Karl at the Houston ASTD Chapter's webinar for a live chat and presentation by the author.

Week One (note: there has already been a change in venue, as my scheduling abilities appear to have been less than stellar; please see below for today's stop):
- April 16: Learning Circuits Blog
- April 17: Gamification Facebook Page
- April 18: Jane Bozarth's Bozarthzone
- April 19: Kevin Kruse, Kevin Kruse Blog (NY Times bestselling author of We: How to Increase Performance and Profits through Full Engagement)
- April 20: Rich Mesch, Performance Punctuated, joined by Judy Unrein, OneHundred Forty Words

Week Two:
- April 23: Clark Quinn, Learnlets
- April 24: Karl Grieb, ASTD Philadelphia Chapter
- April 25: Webinar presentation for the Houston ASTD Chapter, "What Research Tells Us about Games, Gamification and Learning" (join the webinar)
- April 26: Debbie Richards, Take an e-Learning Break; plus a live appearance by Karl at the NY ASTD Chapters' combined SIG meeting (if you are in NY, you may want to register and attend)
- April 27: Connie Malamed, The eLearning Coach

Week Three:
- April 30: Amy Lui Abel, New York ASTD Chapter Blog
- May 1: Cammy Bean, Learning Visions
- May 2: Tom Kuhlmann, Word of Mouth
- May 3: Koreen Olbrich, Learning in Tandem
- May 4: "Surprise Blog Appearance"

Week Four:
- May 7: Mike Qaissaunee, Frequently Asked Q
- May 8: Larry Hiner, drlarryhiner
- May 9: Catherine Lombardozzi, Learning Journal
- May 10: Brent Schlenker, Elearning Development
- May 11: Zaid Ali Alsagoff, Zaid Learn

Week Five:
- May 14: Andrew Hughes, Designing Digitally
- May 15: John Rice, Educational Games Research
- May 16: Christy Tucker, Experiencing E-Learning
- May 17: Justin Brusino, ASTD Learning Circuits Blog (we come full circle to discuss the tour and the gamification concept)
- May 18: Karl Kapp, Kapp Notes (the author provides reflections and lessons learned from the tour)

So join us for this exciting tour and social media event to discuss the pros and cons of gamification and what it means to learning and development professionals. And follow us on Twitter at #gamiLI.
Measuring Learning: The Theory of Change Posted: 13 Apr 2012 08:28 AM PDT

Last week, we kicked off Measurement Month with a discussion of why measurement matters. This week, we'll dig into connecting measurement and design so that metrics are built in from the beginning. Throughout this post, I'll use an example from a training program I worked with earlier in my career to illustrate how measurement models can be put to work in training.

The Alaska Example: Job Training for Teens

In my last job, I worked with Anchorage Youth Employment in Parks (YEP), a program hiring Anchorage teens to complete park improvement projects while learning job skills in trail building, construction, and habitat restoration. In partnership with a variety of community organizations, this program accomplished two key goals: completing park improvements and training teens in in-demand job skills. Because program partners secured public job training funds for this program, it was essential to ensure YEP was successfully developing employment-ready teens. After the program's pilot year, program partners worked with a University of Alaska sociologist to create a data model to measure the YEP program's effectiveness and continually improve the program.

YEP used the Theory of Change model to guide the development of program metrics and evaluation. The Theory of Change model offers program designers a helpful guide for planning and measuring learning and development (from "How does the Theory of Change Process Work?"):

Step One: Identify Goals

Start your program design with the basics: goals. Ask yourself and your stakeholders straightforward questions: What skills must participants have when they finish the program? What new responsibilities will they be qualified to take on next? To develop specific and measurable goals, conduct stakeholder interviews and focus groups.
As you design your program goals, make sure you have a thorough understanding of your target audience, so you can design initiatives that meet both their needs and the professional growth they wish to accomplish for themselves. In the YEP example, the goals for the students were to develop competency in several professional outdoor skills: forest maintenance, trail building, and streambank restoration. Participating teens were required to demonstrate technical competence in these professional skills as well as leadership and teamwork abilities.

Step Two: Connect the Preconditions and Identify the Changes

Each program goal has a chain of necessary preconditions, which in turn require changes from the baseline. The program design phase is an exploration of each step needed to achieve the preconditions for the end goal. In the YEP example, program managers worked with experts in each subject area (trail building, forestry, and watershed restoration) to assess the competencies necessary to reach the desired end goals for the teens. We developed activities (both training programs and work projects) that would enable participating teens to explore different skills and apply them in real work environments. We made sure our summer-long program connected all of the dots between participating teens' beginning qualifications and our desired end goals.

Step Three: Develop Indicators

The next step is to create a data framework for the "before and after" states achieved through a training program. In the Anchorage example, this meant a case study of our training program's target audience. We identified their likely background and education, and their desired outcomes from the program, and created scales to measure where an individual would stand within that range at the beginning and end of the program.
We measured both the teens' soft skills (leadership, communication, and cooperation) and their specific job skills (trail building, forestry, and watershed restoration). We developed detailed assessments for teens to complete at the beginning and end of the program, and a shorter, simpler questionnaire that they filled out weekly. Teens were asked both for self-assessment and for qualitative feedback, to which we later assigned data values. We also asked teens for feedback about the program itself. In addition, we asked both crew leaders (team leaders) and the program supervisor to complete detailed assessments of the teens' skills and abilities at the beginning and end of the program. All of these assessments fed into our data framework.

Finally, we created long-term goals and measurements for the teens' success after participation in the program, including employment in our target sectors, college attendance, career readiness, and advancement in the program in returning years. We designed annual surveys to check in with program alumni and assess their long-term progress.

Step Four: Write the Narrative

The final step in the program design phase of the Theory of Change model is creating a narrative: a model describing how your initiative will create the changes needed to achieve your program goals. This is more than just a pretty story: it is your opportunity to test your logic in plain English. The narrative is a document you will share with your stakeholders to make sure that the steps you outline make sense, and that the data measures you identify connect accurately to the program goals. (In the next post, I'll share some sample narratives to help guide your program design process.)

Step Five: Implement, Iterate, Improve

This last step is where the really good stuff happens.
Once you've designed a program, it's time to implement it, collect data, and use that data to make adjustments and improvements. In the Youth Employment in Parks example, program managers used the Theory of Change model to provide both structure for the program design and metrics to measure its effectiveness. We found the data invaluable for continually improving the program, whether by adding training in specific areas or by reducing or removing unneeded components. We ultimately reduced the time spent in content study while achieving greater mastery of the content. This created a positive feedback loop: participant feedback enabled us to iterate and improve, producing continuously better results.

Next week, we'll examine the process of building your measurement model.
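The "before and after" indicator framework from Step Three can be sketched as a simple scoring computation. This is a hypothetical, minimal illustration, not the YEP program's actual data model: the skill names, the 1-to-5 scale, and the sample ratings are all invented for the example.

```python
# Minimal sketch of a pre/post indicator framework.
# Skills, scale, and scores are invented for illustration only.

SCALE = (1, 5)  # e.g., 1 = novice, 5 = proficient

def change_scores(pre: dict, post: dict) -> dict:
    """Per-skill change from baseline assessment to end-of-program assessment."""
    return {skill: post[skill] - pre[skill] for skill in pre}

def cohort_average(deltas: list) -> dict:
    """Average per-skill change across all participants in a cohort."""
    skills = deltas[0].keys()
    return {s: sum(d[s] for d in deltas) / len(deltas) for s in skills}

# One (invented) participant's pre- and post-program ratings:
pre = {"trail_building": 1, "forestry": 2, "leadership": 2}
post = {"trail_building": 4, "forestry": 3, "leadership": 4}

print(change_scores(pre, post))
# {'trail_building': 3, 'forestry': 1, 'leadership': 2}
```

Averaging these deltas across a cohort gives the kind of program-level indicator that can flag where to add training or remove unneeded components in the next iteration.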
Why Measurement Matters Posted: 10 Apr 2012 12:29 PM PDT

For designers, developers, and trainers, measurement is a weighted word. Employee performance is difficult to measure objectively, as is the success of a training program. Furthermore, for folks whose strength is creative communication, data analytics is a far cry from our comfort zone. Many instructional designers have also been turned off analytics by seeing data used out of context or as the only measure of a complex initiative.

As a result, measuring learning initiatives often falls into the pattern of measuring production ("we've provided X resources" or "we've distributed Y manuals") because these are factors L&D can control. What really matters for an organization, of course, is not how many manuals L&D creates, but how many behaviors or outcomes L&D changes.

With all the demands on our time, it's easy for L&D departments to be consumed by "putting out fires": meeting short-term needs, creating resources for squeaky wheels, and solving immediate performance problems often come before taking time for strategic decision-making. So the need is greater than ever for learning and development to embrace data, measurement, and analytics, enabling us to target initiatives that efficiently meet business objectives, make the best possible use of our limited time, and demonstrate the efficacy of our work to organizational leadership.

Tough Economies Put Pressure on the Numbers

Immersing yourself in data-driven decision making will enable you and your team to make strategic decisions that meet broad organizational goals. Ultimately, the ability to harvest, discuss, and process this information correlates with improved financial outcomes for large organizations: a recent study by KnowledgeAdvisors and Bassi Investments showed that a group of companies with high learning and development measurement acumen outperformed the Standard & Poor's 500 Index in share price appreciation by more than 15 percent.
- "Emerging Issues in Measurement", KnowledgeAdvisors

Many L&D departments dread the measurement and ROI discussion. As a cost center (not a direct revenue generator), L&D is always challenged to justify its connections to organizational success. If L&D professionals use good measurement programs to demonstrate improved productivity, changed behaviors, and improved outcomes, they will be in a stronger position in the next budget cycle.

This month, I'll outline a framework for measuring changed behavior, share examples of this model's success in a social change program I worked with in Anchorage, Alaska, and showcase leading organizations' use of learning analytics to meet organizational goals. We'll continue next week with a discussion of connecting learning design to a data framework from the beginning.