The API economy: why does it matter?

Recently we posted two articles based on Martyn and Jonny’s presentation at Learning Technologies 2016:

APIs can disrupt Edtech – 2 key ingredients for success and
3 key factors when designing APIs for Learning.


If you did not get an opportunity to attend their presentation in London, you can view it here.

Job opportunity – Research Assistant/Project Coordinator

Post: Research Assistant/Project Coordinator
Contract Duration: 2+ years, part-time or full-time
Department: Learnovate Centre, School of Computer Science and Statistics
Salary: The appointment will be made at €30,200 – €46,204 (full-time), depending on experience
Closing Date: April 4th 2016

Summary

Candidates are invited to apply for the position of Research Assistant/Project Coordinator in the Learnovate Centre. The ideal candidate will have excellent facilitation, communication, and influencing skills, as well as insights into one or more of: educational technology, game-based learning, learning design, or pedagogy. Self-motivated and a team player, the candidate will demonstrate a strong team ethos and an ability to develop strong relationships with relevant stakeholders, ensuring their needs are met.

The successful candidate will work on applied educational technology research projects including:

The DEVELOP project, an EU-funded Horizon 2020 project focussing on career development in large companies. This project will create a personalised learning environment for career development that will provide game-based assessment of transversal competencies and social capital to highlight learning opportunities for career development. It will combine this with personalised visualisations of potential career paths to inform and guide learners towards realistic and attainable careers.

Principal Duties:

  • Working with a multi-disciplinary team to define, scope and execute demonstrators, prototypes and bespoke projects
  • Creating and managing schedules, resources and budgets
  • Coordinating complex, multi-partner software projects end to end
  • Working with the Programme Manager to prioritise and maximise resource and time usage across multiple projects
  • Conducting background research on one or more of: educational technology, game-based learning, learning design, or pedagogy
  • Contributing to academic publications and research reports
  • Contributing actively to research projects, including requirements gathering, design, implementation, and evaluation
  • Working with industry partners to provide detailed specifications and requirements
  • Creating and maintaining project documentation and processes in line with best practice
  • Communicating project progress, information and outputs
  • Presenting and reviewing projects with industry partners

Key Requirements

  • Minimum of 1 year’s experience in project coordination, personnel management, or relevant research activities
  • Experience in working as a consultant with internal and external customers
  • Proven stakeholder management skills
  • Excellent verbal communication and interpersonal skills
  • Proven facilitation, collaboration and influencing skills
  • Ability to operate with high flexibility in a constantly evolving team environment
  • Proven ability to prioritise own workload and work to exacting deadlines
  • Strong team player
  • Experience in using standard project coordination tools, including commercial offerings such as MS Project and Excel

Desirable

  • Degree in Computer Science, Education, Psychology or related discipline
  • Research experience in one or more of the following: educational technology, game-based learning, learning design, or pedagogy
  • Experience in applied learning technology research
  • A sound knowledge of software development and testing best practice
  • Experience working in a research driven working environment
  • Experience in process improvement and change management
  • Strong working knowledge of project management practices such as PRINCE2 or PMBOK
  • A proven track record of software project delivery in an EdTech/software environment
  • Experience with Agile methodology

Please Note: No expenses will be paid in travelling to interview.

Background to the Post
This is an opportunity to join an exciting research team, based in the Learnovate Centre in Dublin’s “silicon dock”. We have recently secured significant funding for projects in the areas of corporate career development, and innovative web services/API models for the EdTech sector.

Department Summary
The School of Computer Science and Statistics at Trinity College Dublin offers a high quality educational experience to undergraduate and postgraduate students along with exceptional research opportunities for professionals in the fields of computer science and statistics. The school is proud of its reputation as an innovative and energetic centre for study and research.
The research interests of the school are wide and varied, ranging from the theoretical to the practical. The school’s researchers are at the cutting edge of their disciplines, working on prestigious research projects with other professionals in their fields, and with access to significant public and private funding and industry support.

Trinity College Dublin
Founded in 1592, Trinity College Dublin is the oldest university in Ireland and one of the oldest universities in Western Europe. On today’s campus, state-of-the-art libraries, laboratories and IT facilities stand alongside historic buildings on a city-centre 47-acre campus. Trinity College Dublin is currently ranked 43rd among the top world universities by the Times Higher Education Supplement Global University Rankings 2009, and 13th in Europe. Trinity College Dublin offers a unique educational experience across a range of disciplines in the arts, humanities, engineering, science and human, social and health sciences. As Ireland’s premier university, the pursuit of excellence through research and scholarship is at the heart of a Trinity education. TCD has an outstanding record of publications in high-impact journals, and a track record in winning research funding which is among the best in the country. TCD has developed significant international strength in its research across eight major themes, which include globalisation; cancer; genetics; neuroscience; immunology and infection; communications and intelligent systems; nano and materials science; and Irish culture and the creative arts. TCD aims to become the world reference point in at least one of these areas of research in the next 10 years.

Application Procedure

Candidates are asked to submit a covering letter and a full CV to include the names and contact details of 3 referees (email addresses if possible) to:

Dr Martyn Farrows, Centre Director,
Learnovate Centre, Unit 28, Trinity Technology and Enterprise Campus, Pearse Street, Dublin 2.
Tel: +353-1-896-4912
Email: martyn.farrows@learnovatecentre.org

Edtech Innovation Ecosystem

The 2016 Research and Innovation Conference & Exhibition, held on 3rd March 2016 in Croke Park, saw over 1,000 leaders in research and innovation from Ireland’s leading industries, as well as key government and research bodies, gather for a busy day of seminars and exhibitions.
The focus of the event was Agenda 2020, the ambitious plan to position Ireland as a global knowledge leader and a major hub of scientific and engineering research.


Learnovate’s Peter Gillis spoke at the event about how the success of the Edtech industry in Ireland is being supported by the development of an innovation ecosystem. The talk centred on five key components that have supported its growth:

  1. A market worth chasing
  2. Ireland’s track record
  3. A vibrant start-up scene
  4. A favourable policy environment
  5. Innovation and export support

You can see Peter’s presentation here.

Research Findings in Game Based Learning

Learnovate’s Neil Peirce presented at the 4th meetup in the Learning Tech Labs series. The meetup was focused on gamification!


Looking to answer the questions:

  • What is gamified learning?
  • Does it improve engagement rates and knowledge retention?
  • Does it have a measurable impact on performance?

Also speaking at the well-attended event were Tony Riley from GotchaNinjas and Stephen McManus from Riptide Academy.

Download a PDF of Neil’s Slides

 

About Neil:
Neil holds a degree in Computer Science, a Masters in Multimedia Systems and a PhD from Trinity College Dublin (TCD). Neil’s PhD research focused on the personalisation of learning experiences within educational video games. He is currently the Technology lead in Learnovate where he is heading up a Horizon 2020 project on creating a personalised learning environment and he is chairing this year’s Game Based Learning Conference.

Improving Performance Improvement: Why it’s tough going

By Mirjam Neelen

This blog is the first in a series on improving performance improvement. The idea is to explore various components, such as metrics or employee engagement, that come into play when trying to achieve this. In this blog, I discuss some challenges for both organisations and individual employees with performance management processes (PMPs): Why don’t they support performance improvement as intended?
Many organisations say that they focus on fostering learning and professional development as part of their PMP because of the growing need for highly skilled employees with up-to-date competencies (Van der Rijt et al., 2012). Ideally, the primary purpose of a PMP is to document employee performance and provide feedback regarding task performance and how to improve it (Budworth et al., 2015).

Unfortunately, PMPs are frequently ineffective in that they don’t improve employee job performance and may even negatively affect job satisfaction (Budworth et al., 2015). For example, Buckingham and Goodall (2015) from Deloitte recently reported that, based on a public survey that they conducted, 58% of the executives questioned believe that their current performance management approach drives neither employee engagement[1] nor performance improvement.

So, why don’t PMPs do what they’re intended to do? Why don’t they support performance improvement?

It’s rather strange that there’s no literature expressing or acknowledging how difficult it is to explain what performance improvement actually means.

In general, performance improvement is about “impact”. Though there aren’t any set standards of metrics and key performance indicators (KPIs) to measure impact, some have to exist for each individual whose performance will be evaluated. A good starting point is to ask the right questions. In other words, an organisation needs to ask the right questions throughout the organisation so as to find its own metrics and then give them meaning. While it’s no easy task to give meaning to what impact really means for an organisation, that’s where the effort within the organisation should go (Kapros[2], personal communication, 14 December 2015).

Apart from asking the right questions and figuring out exactly what performance improvement and impact mean, there are more hurdles. One flaw in current PMPs is that performance reviews don’t take place often enough; usually only once or twice a year. Also, these (bi-)annual performance reviews are based on what the calendar says and not on the performance of a task. This makes them both ad hoc (the right thing at the wrong time) and decontextualised (what’s the basis of the review?). Another issue is the friction, let’s call it, between the ‘performance review’ and the actual improvement; that is, ‘learning and professional development’. Telling someone where they stand – performance – is one thing, but how do they know what to do to get better? Will they be rewarded for efforts to improve? And how willing are people to learn – which sometimes involves FAILING – if they know their performance is going to determine their bonus?

In the same sense, performance reviews are usually performance appraisals. Your peers know that the feedback that they are going to give you is going to influence your future and your bonus, and therefore they probably won’t be as honest or blunt. Or, the other way around, they might be hesitant to be too positive about your performance; what if you get promoted and they don’t?

And it’s not only that. Employee performance appraisal data are commonly used to identify who is a key performer and who performs below average. These metrics are often inaccurate, for example because they are ad hoc and decontextualised, or worse, because the interpretation of the metrics is completely subjective (for example, how I interpret “effective communication” might differ from your interpretation of its meaning), particularly in knowledge-intensive environments, since meaningful, tangible metrics are harder to identify (Whelan et al., 2011).

Accurately measuring performance is an even more difficult nut to crack, and so far there don’t seem to be any convincing nut-crackers out there. Maybe some ways exist, but we haven’t found them yet. It almost feels as if we’re only at the stage where we need to find the right questions to ask, rather than look for answers. Big multinationals tend to use rigid competency models and frameworks, and the question is whether these are helpful.

Furthermore, while there are many performance management technologies to choose from on the market, we need to acknowledge that no technology is going to solve the current PMP challenges, ever. Of course technology can help solve bits and pieces, like competency assessment in immersive learning environments, self-reflection tools, electronic performance portfolios, or (peer) feedback tools. Of course, it’s worth exploring the extent to which technologies could support an effective PMP. However, don’t get your hopes up too high, and don’t expect that they will solve all of the problems (Kapros, personal communication, 14 December 2015).

It’s clear that it’s far from easy to improve performance improvement. But there’s hope and there are many components to explore, which I intend to do step-by-step in a series of blogs. Off we go, hopes up high.

REFERENCES
Buckingham, M., & Goodall, A. (2015). Reinventing performance management. Harvard Business Review.

Budworth, M. H., Latham, G. P., & Manroop, L. (2015). Looking forward to performance improvement: A field test of the feedforward interview for performance management. Human Resource Management, 54, 45–54.

Kapros, E., personal communication. http://ekapros.eu/index.html

Van der Rijt, J., Van de Wiel, M. W. J., Van den Bossche, P., Segers, M. S. R., & Gijselaers, W. H. (2012). Contextual antecedents of informal feedback in the workplace. Human Resource Development Quarterly, 233–257.

Whelan, E. (2011). It’s who you know not what you know: A social network analysis approach to talent management. European Journal of International Management, 5(5), 484–500.

[1] It must be noted that there is no single accepted definition of the term ‘employee engagement’, but it is about passion and commitment, and the “willingness to invest oneself and expand one’s discretionary effort to help the employer succeed” (Markos & Sridevi, 2010, p. 90).

[2] http://ekapros.eu/index.html