Impact Led Innovation – 3 stories from the field

Impact-led innovation is all about practical implementation. At Learning Tech Ireland you will get insights from three speakers sharing their experiences applying the concept in real-world learning technology scenarios.


Vivienne Ming (Named one of Inc. Magazine’s “Top 10 Women to Watch in Tech”)
Vivienne will share two experiences developing products, one in the classroom and one in the home, to solve real problems for real people. Her talk covers AI-driven systems for education and the importance of design thinking for improvement science.

Paidi O’Reilly, UCC
Lessons learned from leading an Edtech Research Centre and introducing Design Thinking as a key organisational capability. With empathy-building activities at its core, the Centre specialises in leveraging open source software to develop innovative classroom engagement, assessment, and adaptive learning solutions.

Neil Peirce, Learnovate
The aim of the H2020 DEVELOP project is to deliver an adaptive learning environment that dynamically tailors the exploration, comprehension, and planning of learning opportunities and career paths in medium and large companies. Neil will discuss the Design Thinking approach the team took to gain a deep understanding of user needs, and how, building on these insights, DEVELOP seeks to provide effective solutions.

The day also includes world-expert keynotes, two workshops, and free guides on implementing Impact Led Innovation. Places are filling up, so do not miss this excellent opportunity to boost your team’s innovation capabilities. To book now click here


LX Conference: Learning in the Age of Experience

By Janet Benson

The LX Conference, held on 15–19 May, was the first online conference dedicated to Learning Experience (LX) Design. As a Learning Experience Designer, the conference felt like the right fit for me, partly because of the theme of LX design, but also because I wanted to experience an online conference first-hand and see how the organisers made use of various technologies to host the event.


As a centre of excellence for innovation and research in learning technologies, Learnovate is committed to impactful and sustainable innovation in the learning design world, and hearing from/interacting with experts in the fields of user experience and learning experience design ensures that we are up-to-date on current and future learning design processes, including technologies and strategies.

In one of my favourite talks of the conference, Joyce Seitzinger of Academic Tribe, co-organiser of the conference, provided some practical advice for designing learning. She shares some of my own frustrations with how corporate learning often does not reflect what is happening with learning outside the corporate environment, particularly with regard to technology. Research is often way ahead of where corporate learning sits in terms of technology and effective learning theory; for example, we continue to inflict ‘death by PowerPoint’ on employees and use lengthy read-and-understand training instead of making use of technology to put learners in the driving seat where possible.
Joyce echoed the talks of fellow conference speakers Jess Knott and Perrin Rowland in discussing how we can orchestrate the elements that give learning experiences the opportunity to occur, and she presented a nice process model for L&D design, as well as her own ‘tweaked’ version:


(Joyce Seitzinger, LX Conference, May 2017)

Using the double diamond diagram, Joyce highlighted how we need to find out more about the problems that we are trying to address, before attempting to address potential solutions:
(Joyce Seitzinger, LX Conference, May 2017)

Joyce provided links to other experts on the topic of LX Design, in particular Jeff Patton and his use of story-mapping, and the book ‘Game Storming’ by Dave Gray and Sunni Brown. ‘Game Storming’ (a book that I have since ordered for Learnovate) is a book that provides practical tools and techniques to encourage communication, to help to generate innovative ideas, and to encourage engagement and creativity, which are all parts of the LX design process.

Indi Young, a data scientist, researcher, author, and co-founder of Adaptive Path, focused on designing with empathy and stressed the importance of focusing on customer needs in order to base information on real data rather than intuition.
This is crucial at Learnovate as we work to identify how we can help our clients by listening to their issues and spending time with them in order to promote the participatory design process (actively involving all stakeholders in the design process to help ensure the results meet client needs and are usable).

Dr. Kemi Jona, founding director of the Lowell Institute School at Northwestern University, discussed a platform strategy for enabling next generation learning experiences, and introduced his talk by stating that technology has been used to sustain rather than transform the infrastructure and practices of learning, a bold opening statement.
Working in the area of learning design and technology, and having completed an online MSc. in Digital Education, I would have to disagree with Dr. Jona, as I feel that technology has transformed the learning landscape, although perhaps not to the extent that was expected or promised by advocates of learning tech. For example, learning design and course creation have had to evolve due to the nature of online learning; courses developed for face-to-face learning environments do not simply transfer to the online environment, and therefore new skills and processes have had to be created in order to provide effective learning experiences in the online space.
Dr. Jona stated that learning is lagging behind innovations in other sectors, which I would agree with in relation to corporate learning, and which links back to my earlier notes on Joyce Seitzinger’s discussion. He also highlighted the General Electric (GE) feedback app which they have used to replace the performance review process, something of great interest to us here at Learnovate (see below). The app is called PD@GE and uses peer-to-peer feedback to replace the annual or biannual performance review discussion, which can take place in a vacuum.
At Learnovate we have designed a mobile demonstrator app for competency assessment through peer-to-peer feedback in organisations, to address the challenges of traditional performance management, for example its being ad hoc and decontextualised. Learnovate’s Business Competencies project is nicely aligned with the issues addressed by Dr. Jona.


Jolanda Morkel is a qualified architect and shared her experiences of developing a blended architecture studio in the Department of Architectural Technology at the Cape Peninsula University of Technology (CPUT) in Cape Town, South Africa.
Jolanda spoke about conversational learning and how her students co-designed the blended architecture course through conversation. She recognised that learning design is iterative and never complete, as she is always ‘observing, listening and reflecting’. Making learners part of the process of learning design and course creation encourages engagement and, as Jolanda mentioned, ‘you both learn through the mistakes’. This harks back to the concept of participatory design: involving stakeholders, learners, and others in our learning design processes helps ensure a favourable outcome for our clients and for Learnovate.
For LX designers with UX/UI experience, this conference would not tell you anything you didn’t already know, apart from perhaps the concluding talk by Amy Burvall, which focused on promoting creativity generally (and had far too much content for me to discuss in a blog).
My key takeaways were how we might apply the concepts of user experience design to learning experience design, and to make use of those already established tools where possible and where relevant. The conference also provided me with some useful contacts and resources in the world of LX design that I intend to further research and perhaps discuss on a later blog.
I would say though that the conference organisers could have taken a leaf out of Jolanda Morkel’s book and made more use of the technologies out there to promote communication and collaboration between participants themselves and between participants and speakers.
Maybe next year?

(Jolanda Morkel, LX Conference, May 2017)

You think you have a great idea! But what would your mom think?


Most innovative ideas start with a spark: an idea that will transform learners, performance, and the business. In many cases, friends (your mom!) and potential customers are asked whether they think it is a great idea. With positive feedback, a product or service development project kicks off. Later, everyone is surprised when, in the vast majority of cases, the final result is not successful.

Rob Fitzpatrick, author of The Mom Test, is an expert in ‘how to talk to customers when everyone is lying to you’. Join Rob at Learning Tech Ireland 2017, where he will share his insights on how to develop innovation ideas using proven techniques: “Learning from customers is critical for building new products, but customer feedback is notoriously unreliable, especially pre-launch when it matters most. We’ll look at how to separate the fluffy compliments from the real data (and buying signals), ensuring you get more value out of the time you spend learning from and selling to your customers.”

To book now click here


For more information on the event, speakers, and to book visit our web page

If you have any questions relating to the event, please do not hesitate to contact me or any of the team here at Learnovate. Look forward to seeing you on the 28th!


Peter Gillis
Learnovate Centre

The xAPI story continues and it’s quite an exciting one!

By Mirjam Neelen

Back in 2015, Learnovate completed an initial xAPI project to explore the overall concept and all the things organisations would have to consider when implementing xAPI. We also did a proof of concept in which we implemented xAPI ourselves in a meaningful way for the users involved. This project resulted in:

  • an xAPI Checklist, which helps organisations determine whether they should or could consider xAPI,
  • a proof of concept,
  • a ‘How to’ guide, and
  • an invitation from the LPI to present at the Learning Live conference in London.

Now, halfway through 2017, there are still major questions in the minds of learning folks, such as ‘What are xAPI’s key values?’, ‘When should we consider xAPI?’, and ‘What is the recommended design process when implementing xAPI?’
Learnovate has been very much aware that thought leaders in the xAPI space, such as HT2 Labs, the Connections Forum, and of course the Tin Can xAPI lads and gals have made quite some progress with xAPI implementations and explorations, and we felt it was about time to create an overview of xAPI’s current state of the art.
We dug through academic research (we can’t help ourselves), piled up the case studies, and spoke to Ben Betts from HT2 Labs, who was kind enough to free up some time to share their experience with xAPI (by the way, they have also created some wonderful resources, such as ‘The Learning Technology Manager’s Guide to xAPI’ and, coming soon, ‘Investigating Performance: Design and Outcomes with xAPI’).
After some careful analysis, we had to conclude that there are many good reasons to implement xAPI. The main one: if you’re truly interested in offering learners (no matter if they’re employees or students) the best support to improve themselves, and they’re using various systems, and you need evidence to back up your decisions, then you need learning analytics to put all the pieces of the puzzle together. xAPI enables you to capture the required data from multiple sources, in a structured, standardised way, using a single Learning Record Store (LRS).
The State of the Art report that we produced discusses the ins and outs of the design process and the many things you need to consider when designing for xAPI. At a high level, the recommended design process looks as follows:


The complexity lies in identifying what you want to measure and why, dealing with legacy systems and planning for future ones, data analytics and visualisations, and all the privacy considerations, to name just a few. A fascinating but challenging nut to crack.
We also found some really interesting case studies which together paint quite a clear picture of why organisations go for xAPI and what its benefits are. We have explored seven of them in more depth; two in a K-12 context, two in a higher ed context, and three in a ‘learning for professionals’ context. In the last category, one of Learnovate’s partners, Intuition, has kindly shared their experiences with us. They have implemented xAPI successfully and learnt some valuable lessons along the way.
From the case studies that we’ve collected and written up, a couple of trends arose that can be captured in one phrase: personalising and adapting learning experiences, which take place in many different ways and places, with the goal of increasing learning effectiveness, using learning analytics to drive evidence-informed decision-making. An Emergency Medical Training example using the Internet of Things (beacons) showcases this trend:


  • During a medical training simulation, beacons on EMTs, firefighters, victims, equipment, and an ambulance record data.
  • The data is sent to an xAPI Learning Record Store (LRS) in the cloud, where it is visualised in real time.
  • After the simulation, the data can be analysed to support performance improvement, ultimately improving patient outcomes in medical emergencies.
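To make this concrete, here is a minimal sketch in Python of the ‘actor / verb / object’ statement structure that an LRS stores. The learner, activity ID, and helper function are hypothetical illustrations (the verb URI is one of the standard ADL example verbs); this is not the actual data model used in the EMT case study.

```python
import json


def build_statement(actor_email, actor_name, verb_id, verb_label,
                    activity_id, activity_name):
    """Assemble a minimal xAPI statement as a Python dict.

    Every xAPI statement has at least an actor (who), a verb (did what),
    and an object (to what), each identified in a standardised way.
    """
    return {
        "actor": {
            "objectType": "Agent",
            "name": actor_name,
            "mbox": f"mailto:{actor_email}",  # actor identified by email IRI
        },
        "verb": {
            "id": verb_id,                      # verbs are identified by URI
            "display": {"en-US": verb_label},
        },
        "object": {
            "objectType": "Activity",
            "id": activity_id,                  # activities also use URIs
            "definition": {"name": {"en-US": activity_name}},
        },
    }


# Hypothetical example: a learner completing a simulated EMT exercise.
statement = build_statement(
    "learner@example.com", "Example Learner",
    "http://adlnet.gov/expapi/verbs/completed", "completed",
    "http://example.com/activities/emt-simulation", "EMT simulation",
)

print(json.dumps(statement, indent=2))
```

In a real implementation, this JSON would be sent via an authenticated HTTP POST to the LRS’s statements endpoint, and the LRS would take care of storage, querying, and visualisation.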

Although, as in 2015, we recognised that expanding xAPI beyond a single use case is a tough cookie, we must admit our eyes sparkle a bit: our case studies (and we’re sure there are tons more out there) show not only the potential of xAPI but also the drive in the learning and education arena. To be continued!

In case you’re up for some additional xAPI exploration, please contact us at

Why Learning Professionals1 Should Look Past the End of Their Own Noses

By Mirjam Neelen

I have previously blogged about the learning profession in a corporate context (see here). In that blog (which focused on Learning Designers), I mentioned a couple of issues that I believe the profession really needs to fix. In short, I believe that Learning Professionals must:

  1. become more professional (titles and names are all over the place and many ‘Learning Professionals’ aren’t learning experts; that is, they don’t have any background in or knowledge of the learning sciences),
  2. respond to reality (Learning Professionals are dealing with a changing labour market with jobs that are increasingly about non-routine/non-recurrent skills), and
  3. understand the value of learning technology (when to use or not use it and how to use it most effectively).

In this blog, I focus on the second issue in the broader context of Learning Professionals. The workplace reality is that we as Learning Professionals are dealing with certain challenges that we need to respond to. Clark Quinn has written a couple of blogs related to this topic. For example, in his blog on the need for organisations to address change, he emphasises the need for Learning Professionals to use performance data to measure impact and to facilitate a shift from ‘the training bubble’ to a learning culture, for example by upskilling managers to be coaches. In addition, he says that we need to support employees in self-directed learning (see also our blog on SDL), ensure resources that can serve as performance support tools, and facilitate social learning through technology. Quinn is absolutely right.

In his blog A Field of Dreams Industry, Quinn confronts us Learning Professionals with our ‘dreamy’ approach. We do what we do based on faith. We take orders and do things without properly measuring results. Quinn hits the nail right on the head again. When designing learning experiences and/or performance support tools for employees, or supporting workplace learning in another way, it’s critical for us to show that these approaches have actually had an impact on performance. So, what does that mean?

Performance in this context doesn’t just refer to individual performance but, more importantly, also to organisational performance. For individual employees, performance refers to observable behaviours and the execution of job duties and responsibilities (e.g., outcomes). Organisational performance generally refers to financial performance, such as stock returns, though later in this blog we’ll see that it might be worth looking at it from a different, operational perspective.

When considering effective measurement of performance, Learning Professionals need to understand the relationship between human capital2 and organisational performance.


The relationship between human capital and organisational performance

Russell Crook and colleagues conducted a meta-analysis to shed some light on this relationship. Crook et al. point out that, although we generally assume that investing in human capital supports both positive individual-level and organisation-level performance outcomes, there are also various studies that report that there’s NO significant positive relationship between the two. Therefore, the researchers conducted a meta-analysis to figure out why there are such contradictory results.

To this end, they analysed 66 studies, comparing three different moderators:

  1. Path dependence – They determined whether the studies were cross-sectional versus longitudinal. The idea here is that truly unique and valuable skills most likely develop over time and hence can only be discovered through longitudinal studies.
  2. Organisation-specific versus general human capital – Based on the assumption that if employees have valuable but more generic skills, they can move on to a competitor easily, while an employee with organisation-specific expertise adds value for an organisation because these employees are more likely to a) make decisions that are in harmony with the organisation’s unique strategy, organisational context, and competitive environment and b) stay at the organisation, as they can’t easily transfer to another organisation and therefore probably feel more ‘attached’ to (or stuck with ☺) their organisation. They just can’t quit that easily.
  3. Operational versus global organisational performance measures – The authors explain that sometimes, when certain employees create profits, it might not directly show in organisational performance, because influential stakeholders such as the employee’s manager may use their power to secure potential profits for themselves (organisation-speak for ‘get a raise’). This means that the result might not directly show in global organisational performance measures. Instead, the effect is more likely to show up in operational performance measures, such as customer satisfaction or innovation.

How the results help Learning Professionals to look past the end of their own noses

What did Crook and his colleagues find? Overall, the results suggest that human capital is strongly related to organisational performance. However, the idea that this link can only be identified over time (i.e., longitudinally) wasn’t confirmed. This raises the question of whether perhaps organisational success in itself helps attract and retain employees, instead of there being just a ‘simple’ one-way relationship between human capital and organisational success.

Tae-Youn Park and Jason Shaw (2013) state that the literature offers dozens of established connections for organisational performance, such as location, strategy, technology, organisational processes, physical resources, and unique products and services, which all contribute to organisational success in one way or another. For the Learning Professional it’s important to be aware of this bigger picture in order to figure out what other elements (apart from human capital) influence organisational performance.

The second finding relates to this bigger picture. It suggests that the relationship between human capital and organisational performance is stronger when human capital is organisation-specific rather than general. Organisations need to find a way to retain their people because it takes an employee quite a while to develop organisation-specific knowledge and skills. Put simply, for employees to be motivated to improve their performance and stay in their job takes more than ‘just’ learning and performance improvement (also, see for example this blog by me). It’s necessary for Learning Professionals to investigate organisational culture and build relationships with employees at all levels in the organisation. That way, it will be possible to look beneath the surface and understand employee behaviours and problems at a more profound level.

Why is all this so critical? It goes back to Quinn’s point that we should move away from being order-takers. It’s absolutely vital to look beyond our own little world so that we’ll be able to, first, validate the business or performance problems as stated by, for example, a stakeholder. Only when we have validated that the suggested problem is the actual problem can we determine whether a learning experience and/or performance support tool is the best solution. The need for these steps in the learning & development process can’t be overstated. It is hugely detrimental to individual employees and organisations alike if we design experiences or solutions for problems that don’t exist (no need to measure there; just don’t design them in the first place!) or if we design a wrong solution to an existing problem (this is where the measuring comes in). It might sound like I’m stating the obvious, but I’d be willing to bet good money that all of us Learning Professionals have encountered examples of both. Any bettors out there?

Crook et al’s last finding suggests that operational performance measures indeed correlate more strongly with human capital than global organisational performance measures. So, when including operational performance measures, it’s easier to detect impact. This finding strongly suggests that Learning Professionals need to find the right metrics to measure impact and prove value.

Pulling it all together, Learning Professionals need to a) become effective gatekeepers (prioritising customers and learning and performance challenges) and b) understand the relationship between human capital and organisational performance in order to deliver the most impact, which then c) needs to be proved through appropriate measurement.

Need I say more?

1 Learning professional in this context refers to any role that deals with any kind of learning experiences and/or performance support mechanisms in an organisational context.
2 Human capital refers to the concept that people possess skills, experience, and knowledge and therefore have economic value to organisations because they enhance productivity (Ramlall, 2004).
Crook, T. R., Todd, S. Y., Combs, J. G., Woehr, D. J., & Ketchen, D. J. (2011). Does human capital matter? A meta-analysis of the relationship between human capital and firm performance. Journal of Applied Psychology, 96, 443-456. DOI: 10.1037/a0022147
Park, T.-Y., & Shaw, J. D. (2013). Turnover rates and organizational performance: A meta-analysis. Journal of Applied Psychology, 98, 268-309. DOI: 10.1037/a0030723
Quinn, C., (2017, 8 March) [Blog post]. A ‘field of dreams’ industry. Retrieved from
Quinn, C., (2017, March 1). The change is here [Blog post]. Retrieved from
Ramlall, S., (2004). A review of employee motivation theories and their implications for employee retention within organizations. The Journal of American Academy of Business, Cambridge, 52-63. Retrieved from