Learning Tech Ireland 2017 – Reflection

By Peter Gillis

“Working in eLearning/education in Ireland for nearly 20 years, I have been seeking events like this, and really felt this was the first I’ve attended that really provided useful, research-led and practical advice”

Brendan Strong, Director of Education at Society of Medical Professionals


At Learnovate, as we enter Phase 2, we have been working hard over the last few months to design and develop a value proposition that will see the centre answer the real needs of our customers, create ‘pull’ for our services and deliver real impact. We decided to focus this year’s conference on the concepts that underpin this value proposition, to give members and the wider community an understanding of how Learnovate can help them realise real impact in the area of Learning Tech. We call our approach Impact Led Innovation. It combines the best of Lean Start-Up and Design Thinking to identify, from the outset, learning problems people really care about, validate those assumptions, develop solutions with end users, and identify the sustainable impact that will maximise the potential for success.

This meant taking a very different approach to our conference. Rather than showcasing technologies and discussing themes in the area, we chose, for this year, to structure the day in three tiers. We knew it was risky, but we also knew it was an indirect (and slightly messy) way of ‘validating our assumptions’ about our value proposition.

The purpose of this blog is not to recount the excellent presentations in full, but to offer prompts for those who were there and a flavour for those who could not make it. We would be happy to follow up in person on any aspect of interest.


Following introductions and an outline of Learnovate’s value proposition to deliver Impact Led Innovation for our members, the first stage was to hear insights from world-class experts on Lean Start-Up, problem validation, identifying what customers need, and avoiding the pitfalls that can arise.

Tendayi Viki spoke of the zero correlation between R&D spend and growth, a point that raised a few eyebrows in the audience. The point, however, was that pure R&D has no correlation with growth; it is only when R&D is linked to innovation and customers’ needs that it is effective, a point also illustrated by Tom Melia from Enterprise Ireland earlier in the day. Tendayi warned against what he referred to as ‘Innovation Theatre’: carrying out the exercises to have the appearance of innovation while taking your eye off the end prize. Developing a term from Steve Blank, he introduced the audience to his ‘Eight Steps to the Epiphany’ (four more than Steve!). They are:

  • Capture Ideas
  • Identify Assumptions
  • Prioritize Assumptions
  • Brainstorm Tests
  • Falsifiable Hypotheses
  • Get Out Of The Building
  • Capture Learnings
  • Make Decisions

Personally, I thought his phrase “It’s not iterating if you do it only once” was one of the more humorous and valuable snippets from an excellent session.

Rob Fitzpatrick focused his keynote on talking to customers: first to ‘Learn’ whether a problem you believe exists, exists at all, and if so, whether anybody cares; secondly to ‘Confirm’ that customers will use your solution and, importantly, that they will pay for it. Rob also addressed the three mistakes made when talking to customers: asking for opinions, meetings that go nowhere, and wasting time. Your ego may do well indulging in the activities above, but realistically they get you no closer to understanding whether your idea will be successful.


The second stage was designed to bring the audience a little closer to Impact Led Innovation through three practitioners who have applied the concept in their projects and could share the lessons learned.
Vivienne Ming talked about how, at a high level, we need to address the human condition and set high, ambitious targets if we really want to drive change and pull for our ideas, and she pointed to the failings of summative assessment in education in attaining such outcomes. Drawing on her own work with her EdTech company Socos, Vivienne stressed the need to engage with users to identify real problems and to ensure there is product-market fit and a business demand; otherwise the concept, while commendable, may not survive to achieve the desired impact.
Paidi O’Reilly talked about the lessons he has learned applying Design Thinking and Lean principles in EdTech during his time working with Texuna Technologies in collaboration with UCC. From his experience he identified 12 hard lessons learned, including the importance of how we frame a problem, the importance of identifying needs over wants and, my biggest takeaway of the twelve, that behaviour is the real issue. In Paidi’s words, “We need to accept that behavior is probably bigger than technology”.
Neil Peirce from Learnovate gave a hands-on talk about how our Horizon 2020 project has applied design thinking, showing in detail how its different aspects have been used to define customers and their pain points. The stages demonstrated included identifying users through persona development, validating those personas and revising them based on feedback from interviews. The follow-up workshop process involved identifying both the “As-is” situation, the pain points for users, and subsequently the “To-be” situation, to discover what could make it work. This informed the development and definition of user stories for the project. Neil pointed out five key learnings for carrying out successful workshops:

  • Diversity is important
  • Plan for social barriers
  • Plan for social norms
  • Experts as facilitators
  • Allow for prep-time & documenting results

When it comes to user testing, Neil also pointed out that his team’s three takeaways were:

  • Structured & repeatable sessions, clear objectives
  • Carefully selected questions
  • Test early, test often

groups

Finally, we wanted to let attendees get up close and personal with the concepts through two parallel hands-on workshops. Tendayi focused his workshop on Strategyzer’s Business Model Canvas. Teams of approximately ten identified hypotheses for a learning technology that they believed would be of value. In rapid-fire sessions, Tendayi challenged the teams to identify and prioritise the assumptions behind their hypotheses, single out the riskiest, and consider how they might test it to validate the idea.
Rob’s workshop looked at ‘customer development’. Based on his book ‘The Mom Test’, Rob hosted an engaging and interactive session on how to get past the comfort of compliments and opinions when talking to potential customers, because basically they will lie to you! He showed how to approach customers by asking for concrete feedback on specific issues in their past, and how to identify real problems they have, that they value being solved, and that are not being addressed at the moment.

In conclusion, Owen White finished the day by answering one question: is Learnovate now an innovation consultancy rather than a Learning Technology research centre? Absolutely not. The Impact Led Innovation capability at Learnovate supports our ongoing development of solutions in the Learning Tech market, combining our innovation expertise with our existing expertise in learning design, UI/UX, technology and commercial development.

As I mentioned at the start, ‘Mom Test’ beware, but our initial feedback from members and attendees was that we hit the right note: the topic resonated and people saw clear value in the approach. Among the many positive, unprompted messages from attendees on the day was the following from Brendan Strong, Director of Education at the Society of Medical Professionals: “Working in eLearning/education in Ireland for nearly 20 years, I have been seeking events like this, and really felt this was the first I’ve attended that really provided useful, research-led and practical advice.”

We look forward to assisting our member companies with their Impact Led Innovation projects and to growing the impact our industry has on the world stage.

Learning Technologies Summer Forum

By Janet Benson


The Learning Tech Summer Forum (LTSF) is an event which is designed to develop and expand upon the popular themes from the February Learning Tech conference and exhibition, which Mirjam blogged about earlier this year.

The conference part of the forum focuses on learning in practice, and featured topics such as emerging technologies, collaborative learning, practical social learning and user experience. Alongside it there were over 30 seminars and an exhibition area with nearly 40 exhibitors, ranging from research companies such as Towards Maturity to learning tech companies and learning industry suppliers such as eLearning Studios and the Fosway Group.
With a day as packed as this, choosing which events to attend can be difficult, and I found myself wishing for a time-turner from Harry Potter to enable me to optimise my day.

Learning and the Brain


After the welcome note from organiser Donald Taylor, the opening address was delivered by Dr. Itiel Dror, a neuroscientist at UCL, and dealt with learning and the brain (http://www.cci-hq.com/home.html). A very funny and engaging speaker, Dr. Dror highlighted that ‘REAL learning is difficult’ while discussing three critical (and intertwined) perspectives of real learning at work, namely:
  • Acquire – learners need to understand to be able to learn
  • Memory – learners need to be able to retrieve information in the long run
  • Apply – learners need to be able to apply what they have learned when they return to the workplace

Itiel discussed how the brain has limited resources and that we need to help the brain to learn by guiding it to the right places where possible. He also stated on numerous occasions that “the human mind is not a camera” and that it is active and often requires rewiring in order to change how we do things (we are creatures of habit).


The now and the next of learning and technology

The first seminar I chose to attend was given by David Kelly of the eLearning Guild and was entitled ‘Emerging Technologies – the now and the next of learning and technology’.
David talked about disruption: how disruption itself is neither positive nor negative, but how we react to it is what counts. He spoke about the need to always look at the risks of doing things, as well as the potential benefits, something I believe we need to do more with regard to learning and technology, to ensure we don’t jump on the bandwagon of new technologies and fads. He suggested approaching technology with a sense of play, and recommended looking at how technology is changing how people live before we look at how it might impact how people learn.
On the future of learning and technology, David Kelly discussed interactive video, virtual reality, augmented reality and wearable technology, and referred to a nice recruitment video from Deloitte: http://www.raptmedia.com/customers/deloitte/.
Similar to the ‘Choose Your Own Adventure’ books that I enjoyed as a child, interactive videos look very slick and allow the user to make decisions, but they may be expensive to develop and, to be effective, require a certain level of video-editing skill as well as access to the relevant software.
David referred to Karl Kapp in his discussion of game-based learning and Kapp’s view that we cannot implement this type of learning using the linear instructional design approach. Having read Kapp’s book, ‘The Gamification of Learning and Instruction’, I would agree with Kapp’s view on this; however, this is the case with a number of learning approaches and we as learning experience designers need to constantly evolve and upskill.

Micro Learning?

I was interested to learn what Clive Shepherd and Barry Sampson had to say on the topic of micro learning so attended their afternoon talk on the topic.
I am not sold on the notion of ‘micro learning’ and feel that it is simply a buzzword for something that is, essentially, just learning. It is commonly known that learning should be broken into smaller chunks where possible to avoid the risk of ‘cognitive overload’ in learners; the use of videos, infographics and so on is simply a way of doing this, rather than something that makes it ‘micro learning’.
Clive and Barry defined micro learning as ‘a way of organising self-directed learning into lots of small chunks’ and referred to ‘how-to’ videos as the most common type of micro learning. I believe the word ‘small’ in this definition can be subjective and again, the use of short videos is merely a way of learning, not a type of learning in itself.
The speakers did admit that micro learning cannot be applied to every learning problem and is not inherently motivating, while also stating that spaced practice and repetition are advantages of micro learning. Surely spaced practice and repetition can be applied to most learning scenarios and are not advantages only of so-called micro learning?
I won’t overstress my position on micro learning, but needless to say, I’m still not sold.

UX/LX

My final seminar of the day was delivered by Myles Runham, an independent learning consultant, and concerned the area of User Experience: ‘Why it’s fundamental and how to make it work’.
Myles encouraged the attendees to share their favourite websites and digital products and highlighted that user experience is the primary reason why people switch systems. He also shared the top ten learning tools from 2016, with YouTube at number one and Google Search and Twitter in the second and third spots, respectively.
I agree with Myles’ view that user experience design and learning experience design are not really any different, and I believe as an LX designer that we can use Myles’ three rules for a good user/learner experience:
  • Simple – know what it’s for/what’s expected
  • Consistent – easy to move through/work with
  • Standard – familiar/obvious
Myles reiterated the point that we need to focus on our users (learners) and what they need, as well as getting to know them and bringing them into the planning and development process. Stakeholders may be focused on a different problem, and this is a key takeaway for me as we move into Learnovate Phase II and our new value proposition.

My Takeaways?

Regarding my personal focus on learning experience design, the key takeaways for me were that most training is really re-training, and that we need to analyse how similar or different the new material is to the prior knowledge we want to ‘rewire’.
Also, external motivation such as targets and performance reviews can often get in the way of learning by corrupting motivation. We need to get our learners on board before we can even begin to hope for change to occur, and where possible we need to give them an experience.
As Dr. Dror says, “it’s not what you teach but what they learn that counts”.
As I had to run for my flight, I missed Donald’s closing address, but with a number of takeaways to bring back to Learnovate, I was satisfied that my attendance at this conference was worthwhile for both me and for the centre and am already looking forward to next year.

You think you have a great idea! But what would your mom think?


Most innovative ideas start with a spark: an idea that will transform learners, performance and the business. In many cases, friends (your Mom!) and potential customers are asked if they think it is a great idea. Then, with positive feedback, a product/service development project kicks off. Later, everyone is surprised when, in the vast majority of cases, the final result is not successful.

Rob Fitzpatrick, author of The Mom Test, is an expert in ‘how to talk to customers when everyone is lying to you’. Join Rob at Learning Tech Ireland 2017, where he will share his insights on how to develop innovation ideas using proven techniques: “Learning from customers is critical for building new products, but customer feedback is notoriously unreliable, especially pre-launch when it matters most. We’ll look at how to separate the fluffy compliments from the real data (and buying signals), ensuring you get more value out of the time you spend learning from and selling to your customers.”

To book now click here


For more information on the event and speakers, and to book, visit our web page
http://www.learnovatecentre.org/learning-tech-ireland-2017/

If you have any questions relating to the event, please do not hesitate to contact me or any of the team here at Learnovate. Look forward to seeing you on the 28th!

Best
Peter

Peter Gillis
Learnovate Centre

EARLI Conference Report

Author: Mirjam Neelen

I went to the EARLI special interest group (SIG) 27 conference to present a poster on the DEVELOP project. SIG 27 deals with online and more objective measures of learning processes. The idea is that learning processes are very important to understand because learning is an ongoing process leading to an outcome. The same goes for assessment; that requires considering the process as well. In other words, it shouldn’t be so much about “what” but more about “how” and “why”.
The conference focuses on the state-of-the-art as well as innovative approaches to measure the process of learning. It also looks at innovative solutions to analyse data. The main question for the DEVELOP poster was: How can you objectively measure learning processes if you 1) don’t have control over the objectives, content, and quality of the learning interventions (after all, DEVELOP will work with a trial partner’s learning interventions) and 2) don’t know if the user has completed a learning intervention?

DEVELOP’s poster

I’m far from an experienced “academic conference participant” but I was very interested in this particular one because of the theme. I was eager to discover what is happening in this space and I was curious how big the gap is between the types of evaluations and measurements that currently take place in the workplace learning space and this academic community that focuses on measuring learning processes in an objective manner. I wasn’t 100% sure what these “objective measures” included, although things like “eye-tracking” and “fMRI” crossed my mind. In the light of the DEVELOP project, I was hoping to find out if there are possible objective learning process measures that we might be able to implement in the project.

Structure of the conference

First of all, I found the formats quite interesting. Although the conference offered traditional paper and poster presentations, there was also a flip-the-paper format in which contributors delivered a 5 minute pitch on their research and then the participants picked the one they were most interested in. This way, the conference audience broke into smaller groups, which then allowed for a lot of interaction. My favourite format was the no-or-not-perfect-data sessions in which the presenter shared the study at hand in about 10 minutes, followed by a 10 minute discussion. The fact that it was so clear beforehand that the study was far from perfect enabled very fruitful and constructive interactions.
It’s really not possible to give an overview of all the sessions because there were around 90 in total and I attended about one third of them. Let me first paint the overall picture and then I’ll give some examples.

Themes

There was a strong focus on self-regulated learning (SRL), which can be defined as “the active, constructive process whereby learners set goals for their learning and attempt to monitor, regulate and control their cognition, motivation, and behaviour, guided and constrained by their goals and the contextual features in the environment” (Endedijk et al., 2006, p. 3), and on how to measure these learning processes objectively. The objective measures discussed included eye-tracking; physiological data (e.g. fMRI, facial expressions, and wearables that measure things like electrodermal activity, heart rate, temperature, and acceleration of movement); wearables that measure social interaction in offline settings, such as sociometric badges; and online log files. Often these objective measures were combined with more subjective ones, such as questionnaires or interviews, or with video recordings of participants who were, for example, collaborating on a task. An interesting research method in this context is cued retrospective reporting, because with more objective measures in hand you can ask more focused questions based on them.
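As a concrete illustration of online log files as an objective process measure, here is a minimal sketch (my own toy example, not a tool presented at the conference) that groups a learner’s timestamped log events into sessions and derives time-on-task, one of the simplest process metrics a log can yield:

```python
from datetime import datetime, timedelta

def session_times(events, idle_gap=timedelta(minutes=30)):
    """Group timestamped log events into sessions.

    Two consecutive events further apart than `idle_gap` are taken to
    belong to different sessions; returns a list of (start, end) pairs.
    """
    times = sorted(events)
    sessions = []
    start = prev = times[0]
    for t in times[1:]:
        if t - prev > idle_gap:
            sessions.append((start, prev))
            start = t
        prev = t
    sessions.append((start, prev))
    return sessions

# Toy log: three events in the morning, two in the afternoon.
log = [
    datetime(2017, 6, 1, 9, 0),
    datetime(2017, 6, 1, 9, 10),
    datetime(2017, 6, 1, 9, 25),
    datetime(2017, 6, 1, 14, 0),
    datetime(2017, 6, 1, 14, 5),
]

sessions = session_times(log)
# Total time-on-task: sum of per-session durations.
time_on_task = sum((end - start for start, end in sessions), timedelta())
```

On this toy log the function finds two sessions (morning and afternoon), and the summed durations give a simple time-on-task estimate that could then be set alongside subjective measures such as questionnaires or interviews.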

The domains in which these types of measurement were conducted varied widely. There were sessions on reading, writing, physics, and maths in primary and secondary school settings, but also ones in the adult learning space, for example on collaborative writing, teamwork in general, civil engineering, and so forth.
As Paul Kirschner summarised, learning is cognitive (which includes processes that fall under what others would call “metacognitive”), affective/emotional, motivational, and interactional. All these aspects of learning were covered during the conference as well. When it comes to actually measuring learning processes, three points stood out:

  • First, context is critical. Without it there is no way to interpret your data and to extract any meaning.
  • Second, it’s about combining situational variables and data.
  • Last, it’s about aligning all elements, which is very challenging.

Keynotes

I will give some examples of the sessions that I attended. The first was from Professor Guoying Zhao from the University of Oulu, who discussed reading hidden emotions through micro facial expressions and heart rate estimation. It was a very detailed, technical, and fascinating talk. It’s quite incredible what they can see with their technology: emotions that remain hidden to the human eye.

Keynote Professor Guoying Zhao

The second was from Professor Roger Azevedo from North Carolina State University on multimodal, multichannel process data to measure and foster SRL in real-time with advanced technologies. Azevedo presented the major theoretical, methodological, and analytical challenges that come with using this type of data.

Keynote Professor Roger Azevedo

He also discussed multimodal, multichannel data recently used to detect, track, and model SRL processes while learning with several advanced learning technologies (ALTs). Lastly, he outlined the learning principles designed to enhance ALTs’ capability to provide real-time, intelligent support of learners’ SRL processes. Azevedo gave many examples, such as using personalised eye-tracking visual-attention feedback on a construction site to identify which hazard stimuli did or did not receive attention. This information can also be provided as feedback to workers to communicate search-process deficiencies, trigger self-reflection, and improve subsequent hazard-search performance (Jeelani, Albert, & Azevedo, 2016).

Workplace learning

The other sessions that I attended covered a wide range of domains; however, I would like to focus on the ones that discussed studies in a workplace learning context. One example of a no-or-not-perfect-data session is Wijga’s and Endedijk’s study. It focussed on using process mining to develop a dynamic model of self- and social regulation in relation to team performance in the workplace.

Wijga’s no-or-not-perfect-data session

I was particularly interested in this session because, although we all seem to accept that the regulation of learning in teams contributes to enhancing team performance, we’re not really sure how it happens or how to facilitate or support it. Wijga and Endedijk tracked ICT teams for several months. They videotaped team meetings, and individual team members kept diaries as well. The researchers aim to extract a dynamic model of the relation between the quantity and quality of self- and social regulation of teams in relation to team performance.
Self-regulation includes four flexibly sequenced phases of recursive cognition. These phases are:

  • task perception,
  • goal setting and planning,
  • enacting,
  • and adaptation (Winne & Hadwin, 2008).

Social regulation can be defined as ‘the processes by which multiple others regulate their collective activity. From this perspective, goals and standards are co-constructed. Socially shared regulation is collective regulation where the regulatory processes and products are shared’ (Hadwin & Oshige, pp. 253–254).

Workplace social interaction

I was also very interested in Endedijk’s, de Laat’s and Ufkes’ flip-the-paper session, in which Endedijk discussed their study monitoring social interaction in a team in a healthcare context in order to identify to what extent this could be used as a proxy for informal social learning. Endedijk and colleagues continuously tracked the location of team members using WiFi tags and sociometric badges. They then used social network measures, such as density, centrality, and degree of brokerage, to calculate the dynamics of social interaction patterns. Their research is still in its early phases; however, the idea is that they can qualify the content of the interaction so that they’ll hopefully find some evidence of informal social learning instances. This research could be wonderfully combined with other types of social network analysis, and ideally it would be analysed in the light of (team) performance or, in DEVELOP’s case, in the light of competency or career development.
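To make the network measures mentioned above concrete, here is a minimal sketch (not the study’s actual pipeline; the member names and interaction counts are invented for illustration) of how density and degree centrality can be computed from a table of pairwise interactions, such as might be aggregated from WiFi-tag co-location records:

```python
# Invented interaction counts between hypothetical team members,
# e.g. aggregated from WiFi-tag co-location records.
interactions = {
    ("Ana", "Ben"): 12,
    ("Ana", "Cara"): 7,
    ("Ben", "Cara"): 3,
    ("Ana", "Dee"): 1,
}

members = sorted({m for pair in interactions for m in pair})
n = len(members)

# Density: observed ties as a fraction of all possible ties.
density = len(interactions) / (n * (n - 1) / 2)

# Degree centrality: each member's number of ties divided by the
# maximum possible number of ties (n - 1).
degree = {m: sum(1 for pair in interactions if m in pair) for m in members}
centrality = {m: d / (n - 1) for m, d in degree.items()}
```

Here density is 4/6 (four observed ties out of six possible among four members), and “Ana”, who is connected to everyone, has degree centrality 1.0. In a study like the one above, such measures would be computed repeatedly over time windows to capture the dynamics of the interaction patterns.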

Summary

Although I found many of the sessions very niche and theoretical, for the workplace learning ones I could envisage valuable practical implications (it might be because workplace learning is my own expertise, who knows ☺). The type of research that Azevedo, Endedijk, and Wijga were discussing can really help to determine what type of interactions in what type of contexts in the workplace actually support learning. If we’re able to identify patterns that way, we could also find ways to support the interactions that matter most for learning and/or performance.
Long story short, I left inspired and somewhat overwhelmed, and for a minute I even considered going for my PhD. Dark Finland playing tricks with me, I guess.


References
Endedijk, M., Brekelmans, M., Sleegers, P., & Vermunt, J.D. (2006). Measuring self-regulation in complex learning environments.
Hadwin, A.F., & Oshige, M. Self-regulation, co-regulation, and socially shared regulation: Exploring perspectives of social in self-regulated learning theory. Teachers College Record, 113, 240–264.
Jeelani, I., Albert, A., Azevedo, R., & Jaselskis, E.J. (2016). Development and testing of a personalized hazard-recognition training intervention. Journal of Construction Engineering and Management. DOI: 10.1061/(ASCE)CO.1943-7862.0001256. Retrieved from http://ascelibrary.org/doi/pdf/10.1061/(ASCE)CO.1943-7862.0001256
Winne, P.H., & Hadwin, A.F. (2008). The weave of motivation and self-regulated learning. In D.H. Schunk & B.J. Zimmerman (Eds.), Motivation and Self-Regulated Learning: Theory, Research, and Application (pp. 297–314). New York, NY: Routledge.

Gold & Silver for Learnovate at Global Awards!


We are thrilled to announce that our research project ‘Feedback’ (originally Business Competence Analytics) won Gold in the “MBA and Professional Education” category and went on to pick up Silver in the “Nurturing Employability” category at the Reimagine Education Awards in Philadelphia, USA. The awards attracted over 500 submissions across all categories this year.

Evangelos Kapros from Learnovate was at the ceremony to pick up the awards.

These awards are the result of two years’ hard work by the team at Learnovate and the contributions of many of our industry members. Congratulations to all!

About Feedback

Our mobile peer feedback app allows for contextualised, continuous formative peer assessment of transversal competencies based on near to real-time on-the-job performance. One of the key innovations of the app is that the feedback process supports more evidential feedback methods by using behavioural anchors to reduce subjectivity in the data. Through the peer feedback, competency data is captured regularly, analysed and visualised. This not only gives relevant, timely, and actionable insights to the employee, it also delivers more accurate data to the organisation.

About Reimagine Education
In 2014, QS Quacquarelli Symonds entered into a partnership with the Wharton School at the University of Pennsylvania to launch the first global competition designed to identify the most innovative, novel approaches to higher education. Reimagine Education is the annual global awards conference for innovative higher education pedagogies enhancing learning and employability.