Impact Led Innovation – 3 stories from the field

Impact-led innovation is all about practical implementation. At Learning Tech Ireland you will get insights from three speakers sharing their experiences of applying the concept in real-world learning technology scenarios.


Vivienne Ming (Named one of Inc. Magazine’s “Top 10 Women to Watch in Tech”)
Vivienne will share two experiences of developing products, one in the classroom and one in the home, to solve real problems for real people, covering AI-driven systems for education and the importance of design thinking for improvement science.

Paidi O’Reilly (UCC)
Lessons learned from leading an Edtech Research Centre and introducing Design Thinking as a key organisational capability. With empathy-building activities at its core, the Centre specialises in leveraging open source software to develop innovative classroom engagement, assessment, and adaptive learning solutions.

Neil Peirce (Learnovate)
The aim of the H2020 DEVELOP project is to deliver an adaptive learning environment that dynamically tailors the exploration, comprehension, and planning of learning opportunities and career paths in medium and large companies. Neil will discuss the Design Thinking approach the team took to gain a deep understanding of user needs, and how, building on these insights, DEVELOP seeks to provide effective solutions.

The day also includes world-expert keynotes, two workshops, and free guides on implementing Impact Led Innovation. Places are filling up, so do not miss this excellent opportunity to boost your team’s innovation capabilities. To book now, click here.


LX Conference: Learning in the Age of Experience

By Janet Benson

The LX Conference, held on 15–19 May, was the first online conference dedicated to Learning Experience (LX) Design. As a Learning Experience Designer, this conference felt like the right fit for me, partly because of the theme of LX design, but also because I wanted to experience an online conference first-hand and see how the organisers used various technologies to host the event.


As a centre of excellence for innovation and research in learning technologies, Learnovate is committed to impactful and sustainable innovation in the learning design world. Hearing from and interacting with experts in user experience and learning experience design ensures that we stay up to date on current and future learning design processes, technologies, and strategies.

One of my favourite talks of the conference came from Joyce Seitzinger of Academic Tribe (https://academictribe.co/), co-organiser of the conference, who provided some practical advice for designing learning. She shares some of my own frustrations with how corporate learning often does not reflect what is happening with learning outside the corporate environment, particularly with regard to technology. Research is often way ahead of where corporate learning sits in terms of technology and effective learning theory; for example, we continue to inflict ‘death by PowerPoint’ on employees and rely on lengthy read-and-understand training instead of using technology to put learners in the driving seat where possible.
Echoing the talks of Jess Knott and Perrin Rowland (other conference speakers), Joyce discussed how we can orchestrate the elements that provide the opportunity for learning experiences to occur, and presented a nice process model for L&D design, as well as her own ‘tweaked’ version:


(Joyce Seitzinger, LX Conference, May 2017)

Using the double diamond diagram, Joyce highlighted how we need to find out more about the problems we are trying to address before exploring potential solutions:
(Joyce Seitzinger, LX Conference, May 2017)

Joyce provided links to other experts on the topic of LX Design, in particular Jeff Patton and his use of story-mapping, and the book ‘Game Storming’ by Dave Gray and Sunni Brown. ‘Game Storming’ (which I have since ordered for Learnovate) provides practical tools and techniques to encourage communication, generate innovative ideas, and foster engagement and creativity, all of which are parts of the LX design process.

Indi Young, a data scientist, researcher, author, and co-founder of Adaptive Path (http://www.adaptivepath.com/), focused on designing with empathy and stressed the importance of focusing on customer needs in order to base design decisions on real data rather than intuition.
This is crucial at Learnovate, where we work to identify how we can help our clients by listening to their issues and spending time with them, promoting a participatory design process (actively involving all stakeholders in the design process to help ensure the results meet client needs and are usable).

Dr. Kemi Jona, founding director of the Lowell Institute School at Northwestern University, discussed a platform strategy for enabling next generation learning experiences, and introduced his talk by stating that technology has been used to sustain rather than transform the infrastructure and practices of learning, a bold opening statement.
Working in the area of learning design and technology, and having completed an online MSc. in Digital Education, I would have to disagree with Dr. Jona, as I feel that technology has transformed the learning landscape, although perhaps not to the extent that was expected or promised by advocates of learning tech. For example, learning design and course creation have had to evolve due to the nature of online learning; courses developed for face-to-face learning environments do not simply transfer to the online environment, and so new skills and processes have had to be developed in order to provide effective learning experiences in the online space.
Dr. Jona stated that learning is lagging behind innovations in other sectors, which I would agree with in relation to corporate learning, and which links back to my earlier notes on Joyce Seitzinger’s discussion. He also highlighted the General Electric (GE) feedback app, which GE has used to replace the performance review process, something of great interest to us here at Learnovate (see below). The app is called PD@GE and uses peer-to-peer feedback to replace the annual or biannual performance review discussion, which can take place in a vacuum.
At Learnovate we have designed a mobile demonstrator app for competency assessment through peer-to-peer feedback in organisations, to address the challenges of traditional performance management, for example its being ad hoc and decontextualised. Learnovate’s Business Competencies project is nicely aligned with the issues addressed by Dr. Jona.


Jolanda Morkel is a qualified architect and shared her experiences of developing a blended architecture studio in the Department of Architectural Technology at the Cape Peninsula University of Technology (CPUT) in Cape Town, South Africa.
Jolanda spoke about conversational learning and how her students co-designed the blended architecture course through conversation. She recognised that learning design is iterative and never complete, as she is always ‘observing, listening and reflecting’. Making learners part of the process of learning design and course creation encourages engagement and, as Jolanda mentions, ‘you both learn through the mistakes’. This harks back to the concept of participatory design: ensuring that we involve stakeholders, learners, and others in our learning design processes ensures a favourable outcome for our clients and for Learnovate.
For LX designers with UX/UI experience, this conference would not tell you anything you didn’t already know, apart from perhaps the concluding talk by Amy Burvall, which focused on promoting creativity generally (and had far too much content for me to discuss in a blog).
My key takeaways were how we might apply the concepts of user experience design to learning experience design, and how to make use of already established tools where possible and relevant. The conference also provided me with some useful contacts and resources in the world of LX design that I intend to research further and perhaps discuss in a later blog.
I would say though that the conference organisers could have taken a leaf out of Jolanda Morkel’s book and made more use of the technologies out there to promote communication and collaboration between participants themselves and between participants and speakers.
Maybe next year?

(Jolanda Morkel, LX Conference, May 2017)

The xAPI story continues and it’s quite an exciting one!

By Mirjam Neelen

Back in 2015, Learnovate completed an initial xAPI project to explore the overall concept and all the things organisations would have to consider when implementing xAPI. We also did a proof of concept in which we implemented xAPI ourselves in a meaningful way for the users involved. This project resulted in:

  • an xAPI Checklist, which helps organisations determine whether they should or could consider xAPI,
  • a proof of concept,
  • a ‘How to’ guide, and
  • an invitation from the LPI to present at the Learning Live conference in London.

Now, halfway through 2017, there are still major questions in the minds of learning folks, such as: ‘What are xAPI’s key values? When should you consider xAPI? What is the recommended design process when implementing xAPI?’
Learnovate has been very much aware that thought leaders in the xAPI space, such as HT2 Labs, the Connections Forum, and of course the Tin Can xAPI lads and gals have made quite some progress with xAPI implementations and explorations, and we felt it was about time to create an overview of xAPI’s current state of the art.
We dug through academic research (we can’t help ourselves), piled up the case studies, and spoke to Ben Betts from HT2 Labs, who was kind enough to free up some time to share their experience with xAPI with us (by the way, they have also created some wonderful resources, such as ‘The Learning Technology Manager’s Guide to xAPI’ and, coming soon, ‘Investigating Performance: Design and Outcomes with xAPI’).
After some careful analysis, we had to conclude that there are many good reasons to implement xAPI. The main reason is this: if you’re truly interested in offering learners (whether employees or students) the best support to improve themselves, they’re using various systems, and you need evidence to back up your decisions, then you need learning analytics to put all the pieces of the puzzle together. xAPI enables you to capture the required data from multiple sources, in a structured, standardised way, using a single Learning Record Store (LRS).
The State of the Art report that we produced discusses the ins and outs of the design process and the many things you need to consider when designing for xAPI. At a high level, the recommended design process looks as follows:

[Diagram: the recommended xAPI design process]

The complexity lies in identifying what you want to measure and why you need to measure it, dealing with legacy systems and planning for future ones, data analytics and visualisations, and all the privacy stuff, to name just a few. A fascinating but challenging nut to crack.
We also found some really interesting case studies which together paint quite a clear picture of why organisations go for xAPI and what its benefits are. We have explored seven of them in more depth: two in a K-12 context, two in a higher-ed context, and three in a ‘learning for professionals’ context. In the last category, one of Learnovate’s partners, Intuition, has kindly shared their experiences with us. They have implemented xAPI successfully and learnt some valuable lessons along the way.
From the case studies that we’ve collected and written up, a couple of trends emerged that can actually be captured in one phrase:
Personalising and adapting learning experiences, which take place in many different ways and places, with the goal of increasing learning effectiveness, using learning analytics to drive evidence-informed decision-making. An Emergency Medical Training example using the Internet of Things (beacons) showcases this trend:


During a medical training simulation, beacons on EMTs, firefighters, victims, equipment, and an ambulance record data. The data is sent to an xAPI Learning Record Store (LRS) in the cloud, where it is visualised in real time. After the simulation, the data can be analysed to support performance improvement and, ultimately, better patient outcomes in medical emergencies.
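To make the data capture concrete, here is a minimal sketch (my own illustration, not code from this project) of what one beacon event could look like as an xAPI statement posted to an LRS. The actor, activity IRI, endpoint, and credentials are all hypothetical; only the actor/verb/object structure and the `X-Experience-API-Version` header come from the xAPI specification:

```python
import json
import urllib.request

# A minimal xAPI statement describing one simulated beacon event.
# The mbox, name, and activity IRI below are illustrative placeholders.
statement = {
    "actor": {
        "objectType": "Agent",
        "name": "EMT Trainee",
        "mbox": "mailto:emt.trainee@example.com",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/experienced",
        "display": {"en-US": "experienced"},
    },
    "object": {
        "objectType": "Activity",
        "id": "http://example.com/simulations/cardiac-arrest/defibrillator-applied",
        "definition": {"name": {"en-US": "Defibrillator applied during simulation"}},
    },
}

def send_statement(stmt, endpoint, auth_header):
    """POST a statement to an LRS's /statements resource (endpoint is hypothetical)."""
    req = urllib.request.Request(
        endpoint.rstrip("/") + "/statements",
        data=json.dumps(stmt).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "X-Experience-API-Version": "1.0.3",  # required by the xAPI spec
            "Authorization": auth_header,
        },
        method="POST",
    )
    return urllib.request.urlopen(req)
```

Every tracked action, whether from a beacon, an app, or an LMS, becomes a statement of this same shape, which is what lets a single LRS aggregate data from many systems.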

Although, as in 2015, we recognised that expanding xAPI beyond a single use case is a tough cookie, we must admit our eyes do sparkle a bit: our case studies (and we’re sure there are tons more out there) show not only the potential of xAPI but also the drive in the learning and education arena. To be continued!

In case you’re up for some additional xAPI exploration, please contact us at info@learnovatecentre.org

Why Learning Professionals1 Should Look Past the End of Their Own Noses

By Mirjam Neelen

I have previously blogged about the learning profession in a corporate context (see here). In that blog (which focused on Learning Designers), I mentioned a couple of issues that I believe the profession really needs to fix. In short, I believe that Learning Professionals must:

  1. become more professional (titles and names are all over the place and many ‘Learning Professionals’ aren’t learning experts; that is, they don’t have any background in or knowledge of the learning sciences),
  2. respond to reality (Learning Professionals are dealing with a changing labour market with jobs that are increasingly about non-routine/non-recurrent skills), and
  3. understand the value of learning technology (when to use or not use it and how to use it most effectively).

In this blog, I focus on the second issue in the broader context of Learning Professionals. The workplace reality is that we as Learning Professionals are dealing with certain challenges that we need to respond to. Clark Quinn has written a couple of blogs related to this topic. For example, in his blog on the need for organisations to address change, he emphasises the need for Learning Professionals to use performance data to measure impact and to facilitate a shift from ‘the training bubble’ to a learning culture, for example by upskilling managers to be coaches. In addition, he says that we need to support employees in self-directed learning (see also our blog on SDL), ensure resources that can serve as performance support tools, and facilitate social learning through technology. Quinn is absolutely right.

In his blog A Field of Dreams Industry, Quinn confronts us Learning Professionals with our ‘dreamy’ approach. We do what we do based on faith. We take orders and do things without properly measuring results. Quinn hits the nail on the head again. When designing learning experiences and/or performance support tools for employees, or supporting workplace learning in another way, it’s critical for us to show that these approaches have actually had an impact on performance. So, what does that mean?

Performance in this context doesn’t just refer to individual performance but, more importantly, also to organisational performance. For individual employees, performance refers to observable behaviours and the execution of job duties and responsibilities (e.g., outcomes). Organisational performance generally refers to financial performance, such as stock returns, though later in this blog we’ll see that it might be worth looking at it from a different, operational perspective.

When considering effective measurement of performance, Learning Professionals need to understand the relationship between human capital2 and organisational performance.


The relationship between human capital and organisational performance

Russell Crook and colleagues conducted a meta-analysis to shed some light on this relationship. Crook et al. point out that, although we generally assume that investing in human capital supports positive performance outcomes at both the individual and organisational level, various studies report NO significant positive relationship between the two. The researchers therefore conducted a meta-analysis to figure out why the results are so contradictory.

To this end, they analysed 66 studies, examining three different moderators:

  1. Path dependence – They determined whether the studies were cross-sectional versus longitudinal. The idea here is that truly unique and valuable skills most likely develop over time and hence can only be discovered through longitudinal studies.
  2. Organisation-specific versus general human capital – Based on the assumption that if employees have valuable but more generic skills, they can move on to a competitor easily, while an employee with organisation-specific expertise adds value for an organisation because these employees are more likely to a) make decisions that are in harmony with the organisation’s unique strategy, organisational context, and competitive environment and b) stay at the organisation, as they can’t easily transfer to another organisation and therefore probably feel more ‘attached’ to (or stuck with ☺) their organisation. They just can’t quit that easily.
  3. Operational versus global organisational performance measures – The authors explain that sometimes, when certain employees create profits, it might not directly show in organisational performance because it’s likely that influential stakeholders such as the employee’s manager use their own power to secure potential profits for themselves (organisation-speak for ‘get a raise’). This means that the result might not directly show in global organisational performance measures. Instead, performance measurements are more likely to show in operational performance measures, such as customer satisfaction or innovation.

How the results help Learning Professionals to look past the end of their own noses

What did Crook and his colleagues find? Overall, the results suggest that human capital is strongly related to organisational performance. However, the idea that this link can only be identified over time (i.e., longitudinally) wasn’t confirmed. This raises the question of whether organisational success in itself helps attract and retain employees, rather than there being just a ‘simple’ one-way relationship between human capital and organisational success.

Tae-Youn Park and Jason Shaw (2013) state that the literature offers dozens of established factors in organisational performance, such as location, strategy, technology, organisational processes, physical resources, and unique products and services, which all contribute to organisational success in one way or another. For the Learning Professional, it’s important to be aware of this bigger picture in order to figure out what other elements (apart from human capital) influence organisational performance.

The second finding relates to this bigger picture. It suggests that the relationship between human capital and organisational performance is stronger when human capital is organisation-specific rather than general. Organisations need to find a way to retain their people because it takes an employee quite a while to develop organisation-specific knowledge and skills. Put simply, for employees to be motivated to improve their performance and stay in their job takes more than ‘just’ learning and performance improvement (also, see for example this blog by me). It’s necessary for Learning Professionals to investigate organisational culture and build relationships with employees at all levels in the organisation. That way, it will be possible to look beneath the surface and understand employee behaviours and problems at a more profound level.

Why is all this so critical? It goes back to Quinn’s point that we should move away from being order-takers. It’s absolutely vital to look beyond our own little world so that we can first validate the business or performance problems as stated by, for example, a stakeholder. Only when we have validated that the suggested problem is the actual problem can we determine whether a learning experience and/or performance support tool is the best solution. The need for these steps in the learning & development process can’t be overstated. It will be hugely detrimental for individual employees and organisations alike if we design experiences or solutions for problems that don’t exist (no need to measure there; just don’t design them in the first place!) or if we design the wrong solution to an existing problem (this is where the measuring comes in). It might sound like I’m stating the obvious, but I’d be willing to bet good money that all of us Learning Professionals have encountered examples of both. Any bettors out there?

Crook et al.’s last finding suggests that operational performance measures indeed correlate more strongly with human capital than global organisational performance measures do. So, when operational performance measures are included, it’s easier to detect impact. This finding strongly suggests that Learning Professionals need to find the right metrics to measure impact and prove value.

Pulling it all together, Learning Professionals need to a) become effective gatekeepers (prioritising customers and learning and performance challenges) and b) understand the relationship between human capital and organisational performance in order to deliver the most impact on organisational performance, which then c) needs to be proved through appropriate measurement.

Need I say more?

1 Learning professional in this context refers to any role that deals with any kind of learning experiences and/or performance support mechanisms in an organisational context.
2 Human capital refers to the concept that people possess skills, experience, and knowledge and therefore have economic value to organisations because they enhance productivity (Ramlall, 2004).
References
Crook, T. R., Todd, S. Y., Combs, J. G., Woehr, D. J., & Ketchen, D. J. (2011). Does human capital matter? A meta-analysis of the relationship between human capital and firm performance. Journal of Applied Psychology, 96, 443-456. DOI: 10.1037/a0022147
Park, T.-Y., & Shaw, J. D. (2013). Turnover rates and organizational performance: A meta-analysis. Journal of Applied Psychology, 98, 268-309. DOI: 10.1037/a0030723
Quinn, C. (2017, March 8). A ‘field of dreams’ industry [Blog post]. Retrieved from https://blog.learnlets.com/2017/03/field-dreams-industry/
Quinn, C. (2017, March 1). The change is here [Blog post]. Retrieved from https://blog.learnlets.com/2017/03/the-change-is-here/
Ramlall, S. (2004). A review of employee motivation theories and their implications for employee retention within organizations. The Journal of American Academy of Business, Cambridge, 52-63.

LT17 – Learning to Learn, How to Get Better

By Mirjam Neelen
As always, this year’s Learning Technologies conference covered a wide range of topics, such as emerging technologies, learning models and strategies, engagement, social learning, the science of learning, and of course aligning learning to the business. It’s clear that a strong L&D professional has access to a creative, diverse toolbox and knows how to use it effectively.
What struck me as always is the gap between the “visionary” talks, for example Thimon de Jong’s keynote on living and learning in the connected society, and the experiences that both conference attendees and presenters share about what actually happens in organisations. Don’t get me wrong, these are still really interesting examples of projects or strategies that people have designed and implemented and they’re usually tackling very complex problems. For example, Sharon Claffey Kaliouby’s session on compliance training, which I chaired, clearly brought to the surface that it’s a sensitive balancing act to deliver an effective learning experience that truly impacts behaviour, while at the same time protecting employees from punitive approaches by organisations.
The various sessions on aligning learning and performance strategies to the business (Charles Jennings on 70:20:10, Laura Overton with “How L&D can work smarter for greater impact”, Tobias Kiefer with “The future of the L&D department”, and so the list goes on) show that L&D still struggles massively to drive that much-needed change. It’s not “just” the need for L&D to work more closely with the business; it’s also the need to support employees in increasing their performance through “pull” instead of “push” models.

One example is working out loud (WOL) (John Stepper), a way of networking in which you invest in relationships and find ways to make your work visible and frame it as a contribution. Working out loud circles are small peer support groups in which you work towards a goal. In addition to working out loud, there is also self-directed learning (Stella Collins) and personal knowledge mastery (PKM) (Harold Jarche). These three topics are highly integrated: for both PKM and WOL, you need to be quite a strong self-directed learner. Unfortunately, I didn’t attend Jarche’s session, as it ran in parallel with Stella Collins’ session on self-directed learning, which I chose to attend instead. Jarche’s illustration of how to practise PKM, which I found on SlideShare (see figure 1), clearly shows that employees need quite some skills to do PKM effectively.


Figure 1. Jarche’s How to Practice PKM
Collins explained various interesting facts about neuroscience and learning, for example the importance of sleep for learning. However, I was a bit disappointed because there was nothing tangible in the sense of “this is what I’m going to try in order to support self-directed learning in the workplace”, and from my perspective, self-directed learning is critical. The only takeaway I got from her session is that we should stop relying on managers, because we cannot expect each and every manager to be strong at supporting learning within their team, and it’s more effective to face reality than to fight it.

Stepper’s session on working out loud was very appealing to me, in that it sounds quite “easy” to get started with, and it makes a lot of sense to do so because employees need to take control of their own learning to be effective, adaptive, and competitive. Stepper explains that there’s a set of skills and behaviours that employees can learn in order to work out loud: build relationships, show generosity, make work visible, discover with purpose, and have a growth mindset (note that this has been under debate recently, and Dweck has been defending her research here). The process starts simply, with recognition and appreciation, and then ideally it slowly builds up to contributions that help others; perhaps at some point, you’ll receive something in return. Stepper calls the model “guided mastery” and explains it as small steps, practised over time with feedback and peer support. As someone working out loud, you need to ask yourself three questions: first, what am I trying to accomplish? Next, who is related to my goal? And finally, how can I contribute to them to deepen the relationship? I think this approach can be effective in many different contexts. It’s not rocket science, but it’s about finding ways to get started and keep going, and understanding what affects people’s willingness to do it, such as intrinsic motivation, autonomy, mastery, and purpose.
My favourite sessions were the keynote by Thimon de Jong and Will Thalheimer’s session on spaced learning (and microlearning and subscription learning, as you can see in his slides here). De Jong was just plain funny and had fascinating stuff to share. He talked about how the data that is out there “knows us” and what that implies. For example, a tool such as Crystal Knows trawls through and interprets the things we share on the web and creates a scarily accurate personality profile out of it. According to De Jong, this is what we want: we want our data to be used. To illustrate this, he described research that Vodafone conducted in which their customers expressed that they want Vodafone to use their data more in order to get a more customised experience. We’re willing to sacrifice our privacy as long as we get something in return. He also talked about a trust transition, meaning that we no longer rely on institutions for reliable information; we Google it, moving from institutional to personal trust. I find it intriguing that De Jong didn’t seem to worry about this transition.

The same theme came up in Julia Shaw’s strong talk on memory hacking (on how easy it is to make people believe a fake memory is real through imagination). She stated that it is becoming less important to remember facts because they’re so easy to find on the web. I could write a separate blog on this topic (and probably multiple) because I think the message that this is not a problem is itself a concern. It comes down to ideology, and even common practice (as in “what people do and believe”: Google it and you’ll be fine!), not being aligned with evidence from science. The claim that we don’t need knowledge as much anymore and that we should focus on more generic skills is very popular at the moment (also see my blog with Paul Kirschner on 21st century skills).
To put it very simply, if you don’t have sufficient knowledge and vocabulary to follow the news or any type of information, then you’ll fail to interpret that information correctly. You could argue that you can critically analyse the information using Google, but there’s no such thing as being able to “think critically” or “problem-solve” without proper domain knowledge. Of course you can look things up, compare and contrast, analyse, and so on if you’re not sure of something, but the point is that you’ll be more unaware of what you’re unaware of if you don’t have enough domain knowledge, and hence you might not even realise that you need to look something up. This is exactly why collaboration is so important in organisations for solving complex problems: you can benefit from each other’s knowledge, as people with different domain knowledge view the problem through different lenses. I’ll leave it at that (for now ☺).
Thalheimer’s session on spaced learning is close to my heart because, although I support the notion that “work is learning and learning is work”, I also feel that we need to truly understand how employees learn most effectively. We also need to increase employees’ awareness of effective strategies in order to support them in managing their own learning. Thalheimer explained that spacing is one of the most reliable findings in learning research, yet it’s one of the least utilised methods in the workplace learning field. In short, repetition works, retrieval practice with feedback works, and spacing works. What does the latter mean? I’ll use one of Thalheimer’s examples to explain. Read the scenario on the left-hand side and then try to decide for yourself what the correct answer is on the right-hand side.
[Image: quiz scenario from Thalheimer’s ‘99 Foods’ spacing example]
The correct answer is A. Research has clearly shown, over and over again, that retention is better when we a) leave “non-learning” space between practice sessions (exactly how long is not so clear) and b) apply “looped” learning, meaning we repeatedly come back to previously learned concepts (gluten-free foods, in this case). It might feel counterintuitive, but although it takes more effort in the short term, it increases retention in the long term.
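As a toy illustration of that looped, spaced idea (my own sketch, not something from Thalheimer’s session, and the specific gap lengths are arbitrary since the research doesn’t pin down optimal spacing), a review schedule with expanding gaps could be generated like this:

```python
def review_days(n_reviews, first_gap=1, factor=2):
    """Days on which to revisit a concept after first learning it on day 0.

    The gap between reviews expands each time; the exact gap lengths are
    illustrative only, since research doesn't prescribe precise intervals.
    """
    days, day, gap = [], 0, first_gap
    for _ in range(n_reviews):
        day += gap          # leave "non-learning" space before the next review
        days.append(day)
        gap *= factor       # expand the gap after each review
    return days

# Looping back to the gluten-free foods list on days 1, 3, 7, and 15:
schedule = review_days(4)
```

Massed practice would put all four sessions on day 0; the expanding schedule spreads the same effort out, which is exactly the trade-off described above: more effort short term, better retention long term.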
One of the biggest strengths of Thalheimer’s session was that it is applicable in our jobs. Right away. And this aligns with the feedback that I heard over and over again from attendees: “I want a takeaway. I want to walk away from a session with something I can try, something I can start using.” Although we appreciate inspiring talks, in the end we want to get better at what we do. After all, we’re passionate about learning, right?