
Thoughts on: ‘Sleeping with the enemy’: how far are you prepared to go to make a difference? A look at the divide between academic and allied staff (Wohlmuther, 2008)

At this stage of looking at the matter of professional staff and academic staff in Higher Education, I feel that I’m somewhat flogging a dead horse and everything that needs to be said, has been said. So why am I still looking at this paper? Initially I was concerned that it grated on me because it doesn’t fit with my current narrative that there are significant cultural factors in universities that make it unnecessarily difficult for professional staff – particularly those in education support roles – to be heard when it comes to discussing teaching and learning.

If this was the case, I’d clearly not be doing my best work as a scholar – open to new information and willing to reconsider my world view in the face of it. Having looked over the paper a few times now though, I have to say that I think it’s just not that great a piece of research. A number of assertions are made that simply aren’t supported by the evidence presented and some of the reasoning seems specious. Events from four years prior to the publication date are referred to in the future tense but there is no discussion of whether they happened or what the consequences were.

Assuming that this is poor research – or perhaps poor analysis – it makes me happy that I’ve reached a point where I can identify bad work, but also a little concerned that I’m wrong or missing something, because this was still published in a peer reviewed journal that I’ve found a lot of good work in previously. (Then again, I assume that most journals have their own favoured perspectives and maybe this paper was well aligned with them). I searched in vain for other writing by the author but she appears to be a ghost, with no publications or notable online presence since the paper came out.

In a nutshell: based on an anonymous online survey of 29% of all staff – academic and professional – at her institution, which included questions about demographics, perceptions of the nature of their roles, the ‘divide’ and the value of different types of staff in relation to strategic priorities, the author concludes that there is minimal dissension between academic and “allied” staff and that most of what little there is, is felt by the allied staff.

Now it’s entirely reasonable that this may well be the case but there are a few elements of the paper that seem to undermine the author’s argument. Wohlmuther asks survey participants about their perceptions of a divide but doesn’t dig directly into attitudes towards other kinds of staff, which McInnis (1998), Dobson (2000) and Szekeres (2004) all identified as central factors. She looks at the perceptions of the contributions of academic and allied staff members to the strategic goals of the organisation, which obliquely explores their ‘value’ within the organisation, but it seems limited. Given the ambiguous value of some higher level strategic goals (Winslett, 2016), this would seem to tell an incomplete story.

The greatest weakness of the paper to my mind is that ‘allied’ and ‘academic’ work roles are unclear.

Survey respondents were asked what percentage of their time they spent on allied work and what percentage of their time they should spend on allied work. The term ‘allied work’ was not defined. It was left to the respondent to interpret what they meant by allied work (p.330)

With no further examination of the responses via focus groups or interviews, this alone (to me anyway) seems to make the findings murky.

She found that only 29% of staff – all staff? that is unclear – felt that there was “good understanding and respect for the significance of each others roles and all staff work well together” (p.331) across the institute, however she doesn’t take this to be an indicator of division.

Looking over the paper again, these are probably my main quibbles and perhaps they aren’t so dramatic. This tells me that I still have a way to go before I can truly ‘read’ a paper properly, but I’m on the way.

 

Research update #5

This is just a quick one because I’m getting on a roll and am going to try to skim read 6 papers this weekend and properly read one that I’ve already started. I’ve been quite conscious of the fact that while I’m doing some good (it seems) deep reading, it’s taking a fair while, and looking at the bibliographies even in journal papers makes me mindful that coming up with a useful (and read) list of 50+ papers requires a little getting the lead out.

Happily, I’ve found a contemporary paper (2016) by Greg Winslett of UNE that I’m hopeful will give me a recent take on the issues addressed in the three papers I looked at from the turn of the century. (There are also a host of recent citations that seem pretty pertinent)

Winslett’s paper – still from the Journal of Higher Education Policy and Management (I’m worried about drawing too often from the same journal well, but what can you do) – is about “The struggle to satisfy need: exploring the institutional cues for teaching support staff”.

I like two things about this already – the term teaching support staff seems more suitable than the “education support staff” that I’ve been favouring (although I am sad to lose the ESP acronym) – and the fact that this is about how TSSs can take guidance from university strategies. (We’re in the middle of a strategic revamp at present, so there’s much to think about)

I did also quite like the fact that a paper co-written by my supervisors Peter and Lina was cited. There was a funny moment of “oh, I know them”.

I’m also mindful of the fact that I’m leaning very heavily on papers about and writers from the Australian Higher Education sector. I think I’m ok with this for now but will probably need to consider this in the way that I shape my research questions.

My cool uber-boss, our Associate Dean Education (hi Bronwen), mentioned that I’ve been tweeting a lot about the professional/academic staff divide lately. I felt compelled to clarify that I wasn’t trying to make any particular point and don’t have any issue; it’s just where my research is sitting at the moment – and I guess I’m noticing more when other people are tweeting about it.

(I’m scheduled to move on to Unis as Organisations next Friday – I’m not 100% clear what I mean by this but I think it includes education ecosystems among other things). My way of thinking is also such that I’m more interested in the search for solutions than in dwelling on any possible issues in terms of any divide or tensions between academics and professionals. The way I see things, we are where we are and that part can’t be changed, but by trying to understand it, we can see which bits are working and which can be improved.

I suspect this isn’t going to be the last time that thinking critically about academia in an academic way raises eyebrows.

Research update

Things are definitely feeling better in researchland – this weekend I’ve read 3 papers and 2 blog posts, have blogged about the post and have another post brewing that will capture key ideas from the three papers.

I think my choice of papers has helped me here – I’ve been looking at the divide between professional and academic staff in higher education and this has been a comparatively theory-lite experience, with far less epistemology and pedagogy to unpack than normal.

The papers are all at least a decade (and one is closer to two) old and have left me asking regularly – ok well that’s pretty interesting but where are we now? Have your promised or hoped for changes eventuated or has the academe stubbornly dug in?

Perhaps it was the papers that I chose/found but the role of Education Support People/Professionals is barely even acknowledged and this certainly gives me something to move on with. Developing a broader understanding of attitudes and the impact of external changes (largely governmental in these papers, in terms of higher expectations of accountability and professionalism) has definitely given me a greater feel for the environment and issues.

I have noted that all three papers came from the same journal, the Journal of Higher Education Policy and Management, which seems like a logical place for discussions of the operational side of universities, and I’ll be interested to see whether the question of the role and value of professional staff is considered in journals relating to other aspects of Higher Ed.

I have a meeting booked with my supervisor on Thursday and I’m feeling like I might even have something of substance to discuss, seemingly for the first time in a while. This isn’t to say that the other meetings weren’t productive but I feel much more like I’ve been doing proper scholarship this time around.

Research journals relating to education in business and economics

Sometimes you need to spend hours poring over a list of 20,000+ academic journals looking for those related to education in business and economics. I’d advise against it.

Here are the ones that I found, so you don’t have to.

I’m not a discipline specialist, so I can’t speak to quality, but I’ve included their ratings, which will hopefully help.

Enjoy.

 

Academy of Management Learning and Education (A*)

http://aom.org/Publications/AMLE/Academy-of-Management-Learning—Education.aspx

Statistics Education Research Journal (B)

http://iase-web.org/Publications.php?p=SERJ

The International Journal of Management Education (C)

http://www.journals.elsevier.com/the-international-journal-of-management-education

The Journal of Economic Education (B)

http://www.tandfonline.com/loi/vece20#.V47SWfm7hBc

Global Perspectives on Accounting Education (C)

https://www.questia.com/library/p62212/global-perspectives-on-accounting-education

International Review of Economics Education (C)

http://www.journals.elsevier.com/international-review-of-economics-education/

Issues in Accounting Education (A)

http://aaajournals.org/loi/iace

Journal of Accounting Education (A)

http://www.journals.elsevier.com/journal-of-accounting-education

Journal of Applied Finance: Theory, Practice, Education (B)

http://catalogue.nla.gov.au/Record/7108453?lookfor=isn:1534-6668&offset=1&max=1

Journal of Business Ethics Education (B)

http://www.neilsonjournals.com/JBEE/

Journal of Economics and Finance Education (C)

https://www.economics-finance.org/jefe/jefe.html

Journal of Education for Business (C)

http://www.tandfonline.com/toc/vjeb20/91/5

Journal of Financial Education (B)

http://jfedweb.org/toc.html

Journal of International Business Education (C)

http://www.neilsonjournals.com/JIBE/

Journal of Management Education (B)

http://jme.sagepub.com/

Journal of Marketing Education (B)

http://jmd.sagepub.com/

Journal of Statistics Education (B)

https://www.amstat.org/publications/jse/

Marketing Education Review (C)

http://www.marketingeducationreview.com/

Technology Innovations in Statistics Education (C)

http://escholarship.org/uc/uclastat_cts_tise

 

Thoughts on: Two guides from Ako Aotearoa on education projects and researching learners

It was always my intention that researching in the area that I work in would help me to shape my professional practice (and it is) but I’ve been surprised lately at how much things are flowing in the other direction. I’ve been thinking a lot lately about what is needed to make an educational project successful and how we know that learners have actually benefitted.

This is partially coming from the big picture work that I’m doing with my peers at the university looking at what we’re doing and why, and partially from my own college, which has recently launched a Teaching and Learning Eminence Committee/project to look into what we’re doing with teaching and learning. I wasn’t initially invited onto the committee (it’s all academics), which speaks to some of the ideas that have been emerging in some of my recent posts (the academic/professional divide), as well as the fact that I need to work on raising the profile of my team* and understanding of our* capacity and activities in the college.

Anyway, while trawling through the tweetstream of the recent (and alas final) OLT – Office of Learning and Teaching – conference at #OLTConf2016, I came across a couple of guides published recently by Ako Aotearoa, the New Zealand National Centre for Tertiary Teaching Excellence, that fit the bill perfectly.


One focusses on running effective projects in teaching and learning in tertiary education. It’s kind of project managementy, which isn’t always the most exciting area for me, but it offers a comprehensive and particularly thoughtful overview of what we need to do to take an idea (which should always be driven by enhancing learning) through three key phases identified by Fullan (2007, as cited in Akelma et al, 2011) in the process of driving educational change – initiation, implementation and institutionalisation. The guide – Creating sustainable change to improve outcomes for tertiary learners – is freely available on the Ako Aotearoa website, which is nice.

I took pages and pages of notes and my mind wandered off into other thoughts about immediate and longer term things to do at work and in my research, but the key themes running through the guide were treating change as a process rather than an event, being realistic, working collectively, being honest and communicating well. It breaks down each phase into a number of steps (informed by case studies) and prompts the reader with many pertinent questions to ask of themselves and the project along the way.

The focus of the guide is very much on innovation and change – I’m still thinking about what we do with the practices that are currently working well and how we can integrate the new with the old.

The second guide – A Tertiary Practitioner’s Guide to Collecting Evidence of Learner Benefit – drills down into useful research methodologies for ensuring that our projects and teaching practices are actually serving the learners’ needs. Again, these are informed by helpful case studies and showcase the many places and ways that we can collect data from and about our students throughout the teaching period and beyond.

It did make me wonder whether the research mindset of academics might conventionally be drawn from their discipline. Coming from an organisation with an education and social science orientation, one might expect an emphasis on the qualitative (and there are a lot of surveys suggested – which I wonder about as I have a feeling that students might be a little over-surveyed already) but the guide actually encourages a mixture of methodologies and makes a number of suggestions for merging data, as well as deciding how much is enough.

Definitely some great work from our colleagues across the ditch and well worth checking out.

(* The team is me – but one day…)

More thoughts on: “Digital is not the future – Hacking the institution from the inside” – Technology, practical solutions and further questions

Previously on Screenface.net:

I’ve been participating in an online “hack” looking at “Digital is not the future – Hacking the institution from the inside” with a number of other education designers/technologists.

It’s been pretty great.

I shared some thoughts and summarised some of the discussions tied to the issues we face in supporting and driving institutional change, working with organisational culture and our role as professional staff experts in education design and technology.

There’s still much to talk about: technology and what we need it to do, practical solutions both in place and under consideration / on the wishlist, further questions and a few stray ideas that were generated along the way.

Technology: 

Unsurprisingly, technology was a significant part of our conversation about what we can do in the education support/design/tech realm to help shape the future of our institutions. The core ideas that came up included what we are using it for, and how we sell it and instil confidence in it among our clients – teachers, students and the executive.

The ubiquity and variety of educational technologies means that they can be employed in all areas of the teaching and learning experience. It’s not just being able to watch a recording of the lecture you missed or to take a formative online quiz; it’s signing up for a course, finding your way to class, joining a Spanish conversation group, checking for plagiarism, sharing notes, keeping an eye on at-risk students and so much more.

It’s a fine distinction but Ed Tech is bigger than just “teaching and learning” – it’s also about supporting the job of being a teacher or a learner. I pointed out that the recent “What works and why?” report from the OLT here in Australia gives a strong indication that the tools most highly valued by students are the ones that they can use to organise their studies.
Amber Thomas highlighted that “…better pedagogy isn’t the only quality driver. Students expect convenience and flexibility from their courses” and went on to state that “We need to use digital approaches to support extra-curricular opportunities and richer personal tracking. Our “TEL” tools can enable faster feedback loops and personalised notifications”

Even this is just the tip of the iceberg – it’s not just tools for replicating or improving analog practices – the technology that we support and the work we do offers opportunities for new practices. In some ways this links back closely to the other themes that have emerged – how we can shape the culture of the organisation and how we ensure that we are part of the conversation. A shift in pedagogical approaches and philosophies is a much larger thing than determining the best LMS to use. (But at its best, a shift to a new tool can be a great foot in the door to discussing new pedagogical approaches)

“It is reimagining the pedagogy and understanding the ‘new’ possibilities digital technologies offer to the learning experience where the core issue is” (Caroline Kuhn)

Lesley Gourlay made a compelling argument for us to not throw out the baby with the bathwater when it comes to technology by automatically assuming that tech is good and “analogue” practices are bad. (I’d like to assume that any decent Ed Designer/Tech knows this but it bears repeating and I’m sure we’ve all encountered “thought leaders” with this take on things).

“we can find ourselves collapsing into a form of ‘digital dualism’ which assumes a clear binary between digital and analogue / print-based practices (?)…I would argue there are two problems with this. First, that it suggests educational and social practice can be unproblematically categorised as one or the other of these, where from a sociomaterial perspective I would contend that the material / embodied, the print-based / verbal and the digital are in constant and complex interplay. Secondly, there perhaps is a related risk of falling into a ‘digital = student-centred, inherently better for all purposes’, versus ‘non-digital = retrograde, teacher-centred, indicative of resistance, in need of remediation’.” (Lesley Gourlay)

Another very common theme in the technology realm was the absolute importance of having reliable technology (as well as the right technology).

“Make technology not failing* a priority. All technology fails sometime, but it fails too often in HE institutions. Cash registers in supermarkets almost never fail, because that would be way too much of a risk.” (Sonia Grussendorf)

When it comes to how technology is selected for the institution, a number of people picked up on the tension between having it selected centrally vs by lecturers.

“Decentralize – allow staff to make their own technology (software and hardware) choices” (Peter Bryant)

Infrastructure is also important in supporting technologies (Alex Chapman)

Personally I think that there must be a happy medium. There are a lot of practical reasons that major tools and systems need to be selected, implemented, managed and supported centrally – integration with other systems, economies of scale, security, user experience, accessibility etc. At the same time we also have to ensure that we are best meeting the needs of students and academics in a host of different disciplines, and are able to support innovation and agility. (When it comes to the selection of any tool I think that there still needs to be a process in place to ensure that the tool meets the needs identified – including those of various institutional stakeholders – and can be implemented and supported properly.)

Finally, Andrew Dixon framed his VC elevator pitch in terms of a list of clear goals describing the student experience with technology, which I found to be an effective way of crafting a compelling narrative (or set of narratives) for a busy VC. Here are the first few:

 

  1. They will never lose wifi signal on campus – their wifi will roam seamlessly with them
  2. They will have digital access to lecture notes before the lectures, so that they can annotate them during the lecture.
  3. They will also write down the time at which difficult sub-topics are explained in the lecture so that they can listen again to the captured lecture and compare it with their notes. (Andrew Dixon)

Some practical solutions

Scattered liberally amongst the discussions were descriptions of practical measures that people and institutions are putting in place. I’ll largely let what people said stand on its own – in some cases I’ve added my thoughts in italics afterwards. (Some of the solutions I think were a little more tongue in cheek – part of the fun of the discussion – but I’ll leave it to you to determine which)

Culture / organisation

Our legal team is developing a risk matrix for IT/compliance issues (me)

(We should identify our work) “not just as teaching enhancement but as core digital service delivery” (Amber Thomas)

“we should pitch ‘exposure therapy’ – come up with a whole programme that immerses teaching staff in educational technology, deny them the choice of “I want to do it the old fashioned way” so that they will realise the potential that technologies can have…” (Sonja Grussendorf)

“Lets look at recommendations from all “strategy development” consultations, do a map of the recommendations and see which ones always surface and are never tackled properly.” (Sheila MacNeill)

“Could this vision be something like this: a serendipitous hub of local, participatory, and interdisciplinary teaching and learning, a place of on-going, life-long engagement, where teaching and learning is tailored and curated according to the needs of users, local AND global, actual AND virtual, all underscored by data and analytics?” (Rainer Usselman)

“…build digital spaces to expand our reach and change the physical set up of our learning spaces to empower use of technology…enable more collaborative activities between disciplines” (Silke Lange)

“we need a centralised unit to support the transition and the evolution and persistence of the digital practice – putting the frontliners into forefront of the decision making. This unit requires champions throughout the institutions so that this is truly a peer-led initiative, and a flow of new blood through secondments. A unit that is actively engaging with practitioners and the strategic level of the university” (Peter Bryant)

In terms of metrics – “shift the focus from measuring contact time to more diverse evaluations of student engagement and student experience” (Silke Lange)

“Is there a metric that measures teaching excellence?… Should it be designed in such a way as to minimise gaming? … should we design metrics that are helpful and allow tools to be developed that support teaching quality enhancement?” (David Kernohan) – How do we define or measure teaching excellence?

“the other thing that we need to emphasise about learning analytics is that if it produces actionable insights then the point is to act on the insights” (Amber Thomas) – this needs to be built into the plan for collecting and dealing with the data.

Talking about the NSS (National Student Survey) – “One approach is to build feel-good factor and explain use of NSS to students. Students need to be supported in order to provide qualitative feedback” (David Kernohan) (I’d suggest that feedback from students can be helpful but it needs to be weighted – I’ve seen FB posts from students discussing spite ratings)

“We should use the same metrics that the NSS will use at a more granular levels at the university to allow a more agile intervention to address any issues and learn from best practices. We need to allow flexibility for people to make changes during the year based on previous NSS” (Peter Bryant)

“Institutional structures need to be agile enough to facilitate action in real time on insights gained from data” (Rainer Usselmann) – in real time? What kind of action? What kind of insights? Seems optimistic

“Institutions need at the very least pockets of innovation /labs / discursive skunk works that have licence to fail, where it is safe to fail” (Rainer Usselmann)

“Teachers need more space to innovate their pedagogy and fail in safety” (Silke Lange)

“Is it unfair (or even unethical) to not give students the best possible learning experience that we can?…even if it was a matter of a control group receiving business-as-usual teaching while a test group got the new-and-improved model, aren’t we underserving the control group?” (me)

“I can share two examples from my own experiences
An institution who wanted to shift all their UG programmes from 3 year to 4 year degrees and to deliver an American style degree experience (UniMelb in the mid 2000s)

An institution who wanted to ensure that all degree programmes delivered employability outcomes and graduate attributes at a teaching, learning and assessment level

So those resulted in;
a) curriculum change
b) teaching practice change
c) assessment change
d) marketing change ” (Peter Bryant)

“One practical option that I’m thinking about is adjusting the types of research that academics can be permitted to do in their career path to include research into their own teaching practices. Action research.” (Me) I flagged this with our Associate Dean Education yesterday and was very happy to hear that she is currently working on a paper for an education focussed journal in her discipline and sees great value in supporting this activity in the college.

“I think policy is but one of the pillars that can reinforce organisational behaviour” (Peter Bryant) – yes, part of a carrot/stick approach, and sometimes we do need the stick. Peter also mentions budgets and strategies; I wonder whether they don’t so much change behaviour as support change already embarked upon.

Technology

“let’s court rich people and get some endowments. We can name the service accordingly: “kingmoneybags.universityhandle.ac.uk”. We do it with buildings, why not with services?” (Sonia Grussendorf) – selling naming rights for TELT systems just like buildings – intriguing

We need solid processes for evaluating and implementing Ed Tech and new practices (me)

Pedagogical

“Could creating more ‘tailored’ learning experiences, which better fit the specific needs and learning styles of each individual learner be part of the new pedagogic paradigm?” (Rainer Usselman) (big question though around how this might be supported in terms of workload)

“At Coventry, we may be piloting designing your own degree” (Sylvester Arnab)

“The challenge comes in designing the modules so as to minimise prerequisites, or make them explicit in certain recommended pathways” (Christopher Fryer)

I went on to suggest that digital badges and tools such as MyCourseMap might help to support this model. Sylvester noted that he is aware that “these learning experiences, paths, patterns, plans have to be validated somehow”. Learner convenience over pedagogy – or is it part of pedagogy, in line with adult learning principles of self-efficacy and motivation? In a design your own degree course, how do we ensure that learners don’t just choose the easiest subjects – how do we avoid the trap of having learners think they know enough to choose wisely?

“digital might be able to help with time-shifting slots to increase flexibility with more distributed collaboration, flipped teaching, online assessment” (George Roberts)

 

“At UCL we are in the midst of an institution-wide pedagogic redesign through the Connected Curriculum. This is our framework for research-based education which will see every student engaging in research and enquiry from the very start of their programme until they graduate (and beyond). More at http://www.ucl.ac.uk/teaching-learning/connected-curriculum

The connected bit involves students making connections with each other, with researchers, beyond modules and programmes, across years of study, across different disciplines, with alumni, employers, and showcase their work to the wider world…

There is strong top-down support, but also a middle-out approach with faculties having CC fellows on part time secondments to plan how to introduce and embed the CC in their discipline.

From a TEL perspective we need to provide a digital infrastructure to support all of this connectivity – big project just getting going. Requirements gathering has been challenging… And we’re also running workshops to help programme and module teams to design curricula that support research-based and connected learning.” (Fiona Strawbridge) – liking this a lot, embedding practice. What relationship do these fellows have with lecturers?

 

“I am imagining that my research, personal learning environment would fit perfect with this approach as I am thinking the PLE as a toolbox to do research. There is also a potential there to engage student in open practice, etc.” (Caroline Kuhn)

“There may be a “metapedagogy” around the use of the VLE as a proxy for knowledge management systems in some broad fields of employment: consultancy, financial services, engineering…” (George Roberts)  (which I’d tie to employability)

“We need to challenge the traditional model of teaching, namely didactic delivery of knowledge. The ways in which our learning spaces are currently designed -neat rows, whiteboard at front, affords specific behaviours in staff and students. At the moment virtual learning spaces replicate existing practices, rather than enabling a transformative learning experience. The way forward is to encourage a curricula founded on enquiry-based learning that utilise the digital space as professional practitioners would be expected to” (Silke Lange) – maybe but none of this describes where or how lecturers learn these new teaching skills. Do we need to figure out an evolutionary timeline to get to this place, where every year or semester, lecturers have to take one further step, add one new practice?

“Do not impose a pedagogy. Get rid of the curricula. Empower students to explore and to interact with one another. The role of the teacher is as expert, navigator, orienteer, editor, curator and contextualisor of the subject. Use heuristic, problem-based learning that is open and collaborative. Teach students why they need to learn” (Christopher Fryer)

 

This is but a cherry-picked selection of the ideas and actions that people raised in this hack but I think it gives a sense of some of the common themes that emerged and of the passion that people feel for our work in supporting innovation and good practices in our institutions.  I jotted down a number of stray ideas for further action in my own workplace as well as broader areas to investigate in the pursuit of my own research.

As always, the biggest question for me is that of how we move the ideas from the screen into practice.

Further questions

How are we defining pedagogical improvements – is it just strictly about teaching and learning principles (i.e. cognition, transfer etc) or is it broader – is the act of being a learner/teacher a part of this (and thus the “job” of being these people which includes a broader suite of tools) (me)

What if we can show how learning design/UX principles lead to better written papers by academics? – more value to them (secondary benefits) (me)

“how much extra resource is required to make really good use of technology, and where do we expect that resource to come from?” (Andrew Dixon)

Where will I put external factors like the TEF / NSS into my research? Is it still part of the organisation/institution? Because there are factors outside the institution like this that need to be considered – govt initiatives / laws / ???

Are MOOCs for recruitment? Marketing? (MOOCeting?)

“How do we demonstrate what we do will position the organisation more effectively? How do we make sure we stay in the conversation and not be relegated to simply providing services aligned with other people’s strategies” (arguably the latter is part of our job)

“How do we embed technology and innovative pedagogical practices within the strategic plans and processes at our institutions?” (Peter Bryant)

Further research

Psychology of academia and relationships between academic and professional staff. (Executive tends to come from academia)

“A useful way to categorise IT is according to benefits realisation. For each service offered, a benefits map should articulate why we are providing the service and how it benefits the university.” (See https://en.wikipedia.org/wiki/Benefits_realisation_management ) (Andrew Dixon)

Leadership and getting things done / implementing change, organisational change

How is organisational (particularly university) culture defined, formed and shaped?

Actor-network theory

Design research

Some ideas this generated for me

Instead of tech tool based workshops – or in addition at least – perhaps some learning theme based seminars/debates (with mini-presentations). Assessment / Deeper learning / Activities / Reflection

Innovation – can be an off-putting / scary term for academics with little faith in their own skills but it’s the buzzword of the day for leadership. How can we address this conflict? How can we even define innovation within the college?

What if we bring academics into a teaching and learning / Ed tech/design support team?

Telling the story of what we need by describing what it looks like and how students/academics use it in scenario / case study format offers a more engaging narrative

What is the role of professional bodies (E.g. unions like the NTEU) in these discussions?

Are well-off, “prestigious” universities the best places to try to innovate? Is there less of a driving urge, no pressing threat to survival? Perhaps this isn’t the best way to frame it – a better question to ask might be – if we’re so great, what should other universities be learning from us to improve their own practices? (And then, would we want to share that knowledge with our competitors?)

“I was thinking about the power that could lie behind a social bookmarking tool when doing a dissertation, not only to be able to store and clasify a resource but also to share it with a group of likeminded researcher and also to see what other have found about the same topic.” (Caroline Kuhn) – kind of like sharing annotated bibliographies?

Bigger push for constructive alignment

I need to talk more about teaching and learning concepts in the college to be seen as the person that knows about it

In conclusion

I’d really like to thank the organisers of the Digital is not the future Hack for their efforts in bringing this all together and all of the people that participated and shared so many wonderful and varied perspectives and ideas. Conversation is still happening over there from what I can see and it’s well worth taking a look.


Thoughts on: “What works and why?” OLT Project 2016

The Office for Learning and Teaching (OLT) is an Australian government body intended to support best practice in enhancing teaching and learning in the Higher Education sector.

It funds a number of research projects, which in 2013 included “What works and why? Understanding successful technology enhanced learning within institutional contexts” – driven by Monash University in Victoria and Griffith University in Queensland and led by Neil Selwyn.

The final report for this project has now been published.
They also have a project website running at http://bit.ly/TELwhatworksandwhy

Rather than focussing on the “state of the art”, the project focuses on the “state of the actual” – the current implementations of TELT practices in universities that are having some measure of success. It might not be the most inspiring list (more on that shortly) but it is valuable to have a snapshot of where we are, what educators and students value and the key issues that the executive face (or think they face) in pursuing further innovation.

The report identifies 13 conditions for successful use of Tech Enhanced Learning in the institution and with teachers and students. (Strictly speaking, they call it technology enabled learning, which grates with me far more than I might’ve expected – yes, it’s ultimately semantics but for me the implication is that the learning couldn’t occur without the tech and that seems untrue. So because this is my blog, I’m going to take the liberty of using enhanced)


The authors took a measured approach to the research, beginning with a large scale survey of teacher and student attitudes toward TEL which offered a set of data that informed questions in a number of focus groups. This then helped to identify a set of 10 instances of “promising practice” at the two participating universities that were explored in case studies. The final phase involved interviewing senior management at the 39 Australian universities to get feedback on the practicality of implementing/realising the conditions of success.

So far, so good. The authors make the point that the majority of research in the TELT field relates to more cutting edge uses in relatively specific cohorts and while this can be enlightening and exciting, it can overlook the practical realities of implementing these at scale within a university learning ecosystem. As a learning technologist, this is where I live.

What did they discover?

The most prominent ways in which digital technologies were perceived as ‘working’ for students related to the logistics of university study. These practices and activities included:

  • Organising schedules and fulfilling course requirements;
  • Time management and time-saving; and
  • Being able to engage with university studies on a ‘remote’ and/or mobile basis

One of the most prominent practices related directly to learning was using digital technologies to ‘research information’; ‘Reviewing, replaying and revising’ digital learning content (most notably accessing lecture materials and recordings) was also reported at relatively high levels.

Why technologies ‘work’ – staff perspectives

The most frequently nominated ways in which staff perceived digital technologies to be ‘working’ related to the logistics of university teaching and learning. These included being able to co-ordinate students, resources and interactions in one centralised place. This reveals a frequently encountered ‘reality’ of digital technologies in this project: technologies are currently perceived by staff and students to have a large, if not primary, role in enabling the act of being a teacher or student, rather than enabling the learning.

Nevertheless, the staff survey did demonstrate that technologies were valued as a way to support learning, including delivering instructional content and information to students in accessible and differentiated forms. This was seen to support ‘visual’ learning, and to benefit students who wanted to access content at different times and/or different places.

So in broad terms, I’d suggest that technology in higher ed is seen pretty much exactly the same way we treat most technology – it doesn’t change our lives so much as help us to live them.

To extrapolate from that then, when we do want to implement new tools and ways of learning and teaching with technology, it is vital to make it clear to students and teachers exactly how they will benefit from it as part of the process of getting them on board. We can mandate the use of tools and people will grumblingly accept it but it is only when they value it that they will use it willingly and look for ways to improve their activities (and the tool itself).

The next phase of the research, looking at identified examples of ‘promising practice’ to develop the “conditions for success”, is a logical progression but looking at some of the practices used, it feels like the project was aiming too low. (And I appreciate that it is a low-hanging-fruit / quick-wins kind of project and people in my sphere are by our natures more excited by the next big thing but all the same, if we’re going to be satisfied with the bare minimum, will that stunt our growth?) In fairness, the report explicitly says “the cases were not chosen according to the most ‘interesting’, ‘innovative’ or ‘cutting-edge’ examples of technology use, but rather were chosen to demonstrate sustainable examples of TEL”

Some of the practices identified are things that I’ve gradually been pushing in my own university so naturally I think they’re top shelf innovations 🙂 – things like live polling in lectures, flipping the classroom, 3D printing and virtual simulations. Others however included the use of online forums, providing videos as supplementary material and using “online learning tools” – aka an LMS. For the final three, I’m not sure how they aren’t just considered standard parts of teaching and learning rather than something promising. (But really, it’s a small quibble I guess and I’ll move on)

The third stage asked senior management to rank the usefulness of the conditions of success that were identified from the case studies and to comment on how soon their universities would likely be in a position to demonstrate them. The authors seemed surprised by some of the responses – notably the resistance to the idea of taking “permissive approaches to configuring systems and choosing software”. As someone “on the ground” who bumps into these kinds of questions on a daily basis, this is where it became clear to me that the researchers have still been looking at this issue from a distance and with a slightly more theoretical mindset. There is no clear indication anywhere in this paper that they discussed this research with professional staff (i.e. education designers or learning technologists) who are often at the nexus of all of these kinds of issues. Trying to filter out my ‘professional hurt feelings’, it still seems a lot like a missed opportunity.

No, wait, I did just notice in the recommendations that “central university agencies” could take more responsibility for encouraging a more positive culture related to TEL among teachers.

Yup.

Moving on, I scribbled a bunch of random notes and thoughts over this report as I read it (active reading) and I might just share these in no particular order.

  • Educators is a good word. (I’m currently struggling with teachers vs lecturers vs academics)
  • How do we define how technologies are being used “successfully and effectively”?
  • Ed Tech largely being used to enrich rather than change
  • Condition of success 7: “the uses of digital technology fit with familiar ways of teaching” – scaffolded teaching
  • Condition of success 10: “educators create digital content fit for different modes of consumption” – great but it’s still an extra workload and skill set
  • dominant institutional concerns include “satisfying a perceived need for innovation that precludes more obvious or familiar ways of engaging in TEL” – no idea how we get around the need for ‘visionaries’ at the top of the tree to have big announceables that seemingly come from nowhere. Give me a good listener any day.
  • for learners to succeed with ed tech they need better digital skills (anyone who mentions digital natives automatically loses 10 points) – how do we embed this? What are the rates of voluntary uptake of existing study skills training?
  • We need to normalise new practices but innovators/early adopters should still be rewarded and recognised
  • it’s funny how quickly ed tech papers date – excitement about podcasts (which still have a place) makes this feel ancient
  • How can we best sell new practices and ideas to academics and executive? Showcases or 5 min, magazine show style video clips (like Beyond 2000 – oh I’m so old)
  • Stats about which tools students find useful – the data is frustratingly simple. The highest rating tool is “supplementing lectures, tutorials, practicals and labs” with “additional resources” at 42% (so do 58% not find it useful? – hardly a ringing endorsement)
  • Tools that students were polled about were all online tools – except e-books. Where do offline tools sit?
  • Why are students so much more comfortable using Facebook for communication and collaboration than the LMS?
  • 60% of students still using shared/provided computers over BYOD. (Be interesting to see what the figure is now)
  • Promising practice – “Illustrating the problem: digital annotation tools in large classes” – vs writing on the board?
  • conditions for success don’t acknowledge policy or legal compliance issues (privacy, IP and copyright)
  • conditions for success assume students are digitally literate
  • there’s nothing in here about training
  • unis are ok with failure in research but not teaching
  • calling practices “innovations” signals them as non-standard or exceptions – good point. Easier to ignore them
  • nothing in here about whether technology is fit for purpose

Ultimately I got a lot out of this report and will use it to spark further discussion in my own work. I think there are definitely gaps and this is great for me because it offers some direction for my own research – most particularly in the role of educational support staff and factors beyond the institution/educator/student that come into play.

Update: 18/4/16 – Dr Michael Henderson of Monash got in touch to thank me for the in-depth look at the report and to also clarify that “we did indeed interview and survey teaching staff and professional staff, including faculty based and central educational / instructional designers”

Which kind of makes sense in a study of this scale – certainly easy enough to pare back elements when you’re trying to create a compelling narrative in a final report I’m sure.

Random PhD tips – the early years

One of the things that I’m finding with this study is that there is a wealth of advice out there and it still feels vaguely productive to pore over that instead of pressing on with actual “work”.

Here are some scattered tips and things that I’ve read and been told in no particular order.

The PhD is essentially a research apprenticeship. While one of the stated goals is to make a contribution to the scholarship, a large part of it is about demonstrating that you are able to competently carry out research and use it to build a solid argument. It doesn’t have to be massive or incredibly sophisticated – indeed, there’s a respected paper (yet to read but it’s on the list) titled “It’s a PhD, not a Nobel Prize” by Mullins and Kiley (2002).

In the early stage of your research, it’s all about the reading. You don’t know what you don’t know yet or where the literature will take you so it’s just about reading, reading and more reading. Everyone has said to enjoy this phase because you generally never get to be this indulgent again. (I’m happy with this advice but I also want to make sure that I make the most of the reading that I do so I’ve been fiddling around the edges looking for ways to capture the information, quotes, ideas and further reading to be found in it – partially in a bibliographic/citation tool (Zotero) and partially in blogging about it)

How to read is also a thing. Given the ridiculous amount of material out there, reading cover to cover isn’t going to get it done. I’ve repeatedly had it suggested that I skim the abstract, the introduction and the conclusion, then the references, and if it all seems relevant, dip into the methodology. Then decide whether to proceed. This part I’m finding harder, as I’m yet to feel confident that I can make this judgement, but I’m sure it will come with time. It seems sensible to put a little more trust into readings recommended by my supervisors and clever colleagues and so far, so good.

Building some solid organisational systems is a no brainer. This is something that I particularly enjoy – probably more than the work part that actually comes after it. Working out categories and folders and backups is great but I’ll have to press on pretty soon.

Emotional resilience is a pretty common theme in the advice literature. It’s a long, draining process requiring us to put our ideas and intellect on display to the world and there will inevitably be some tough feedback. There’s also a lot of talk about imposter syndrome – the fear that people will realise that we aren’t as smart as we make out. (I initially typed smark there, so I’m not sure what that says). This seems reasonably healthy to me – kind of the inverse of the Dunning-Kruger effect.

I’ve already felt fairly conscious of the fact that many of my peers seem to be using more theoretical (dare I say jargony) terms in their writing and I wonder if I am judged for not doing this. (I have a political thing about “plain English” and accessible language but I do understand that there are some terms and concepts that are far more effectively communicated in “jargon”). At the moment, I feel that the best thing for me to do is to use the language that best enables me to share my thoughts. (I have found a glossary tool that adds mouse-over definitions for terms in the text – I’ve added it to this blog and will be populating it at some point. One of my favourite things about reading on the Kindle is being able to highlight words for an instant definition. The Trowler book that I’ve been banging on about has a nice linked glossary section at the back, which has helped me to get across concepts like ontology and epistemology, endogenous and etic, among others)

Communicating early and often is also a key theme in the advice so far – as a fairly self-reliant person this is certainly something that I’ll need to work on but I definitely see the value. Many hands etc etc. In the past I’ve liked/needed to thrash an idea out in my head to come up with some kind of solution that I was willing to share with others but I just don’t think this is going to cut it this time around.

Being open to ideas from seemingly unrelated areas is another great piece of advice. A lot of these have come from Inger “Thesis Whisperer” Mewburn’s blog and “How to Tame your PhD” book, I must add. Going to talks by other researchers might lead to any number of brain waves as you extrapolate their ideas to your context.

I’m not super close to the writing stage yet but like the idea of setting time limits to get certain things done. Avoiding that whole “work expands to fill the time available to it” trap. I’ve long been an advocate of the Pomodoro technique (the hardest part is starting the timer) and I understand the notion of “the perfect is the enemy of the good/done”. All of this writing will go through umpteen drafts, so it doesn’t have to be good the first (or the fifth) time, it just has to be written.

Talking to people about what you’re doing and what you’re planning (particularly supervisors) can be an effective way to add some deadline and social pressure to do the work. (I’m still trying to work out exactly what I need to be doing yet – it seems like just reading is too easy)

There are a lot of people around that want to help us to succeed and are willing to help – there is research training, library skills training, communities of practice and reams of published advice. For all of this, I am particularly grateful. Thanks.

Thoughts on: “Doing Insider Research in Universities” (Trowler, 2012) Part 4 – Social practice theory

Ok, so one more post and I can put this book (Is it still a book on a Kindle?) down.

Social Practice Theory is a direction that I’ve been encouraged to explore by my supervisor and I can see that it offers some promise if I do end up travelling down the “how things work in organisations” type of path.

I have to be honest, I understand that theory is important in underpinning the shape of one’s research and in being able to make a contribution to the scholarly world but what I’ve seen of SPT so far feels a little simplistic. Again, this is kind of the point of theory in that “making the familiar strange” type way and I have no doubt that the further into it I go, the more complex it will seem. On the plus side, there’s nothing in it yet that I strongly disagree with.

To paraphrase (horribly) my current understanding – the things that people do are influenced by their contexts and particularly the physical/material aspects of the things that they interact with. (The actions also influence the form that the material things take). People have a set of things that they do regularly and routinely. The contexts and external actors on these practices are important and should not be overlooked – these include the locations and groups where the actions take place.

Some interesting quotes from Trowler:

Practices have an evolving trajectory, rarely a revolutionary one

The process of context generation… is a very significant factor in changing organisations

Change initiatives work best when they are grounded in current sets of practices and build on them

(That has been a fairly common constant in much of what I have been reading of late – we want to scaffold “innovation” where possible. That and benchmarking really)

Trowler identifies what he calls eight “moments” in teaching and learning regimes (TLRs), which shape contextual concerns and which depict the social practices in place:

1. Recurrent practices
2. Tacit assumptions
3. Implicit theories of teaching and learning
4. Discursive repertoires
5. Conventions of appropriateness
6. Power relations
7. Subjectivities in interaction
8. Codes of signification

I’m not yet sure how I’ll start to unpack these but they appear to merit further consideration.

He wraps the book up with a discussion of further resources and some of the gaps in the current literature. He notes that there has been much less research looking into the organisational and management sides of higher education than other areas.

He goes on to examine some of the more common methodologies used in insider research in higher education, which gives me a few more leads to follow.

Common research methods or methodologies used in higher education research generally are: documentary analysis; comparative analysis; interviews; surveys; multivariate analyses; conceptual analyses, phenomenography; critical/feminist perspectives; and auto/biographical and observational studies

As an entry-way to this particular type of research, I found this book invaluable in setting the scene, systematically laying out the pros and cons of a range of approaches and providing some theoretical frameworks to wrap it up in. In fairness, I’m still at the stage where I don’t entirely know what I don’t know, but there is an evenhandedness and accessibility to his writing that inspires some confidence.

I feel relatively confident that, given enough thought, it will be possible to conduct some interesting and worthwhile research in my institution.

 

Thoughts on: “Doing Insider Research in Universities” (Trowler, 2012) Part 3 – Good research design and ethics/politics

I’m not sure that this is how I’ll process all of the books that I read – in fact I’m almost certain that it isn’t – but I’ll continue this series of posts about Trowler – Doing Insider Research in Universities because I have found it to be a great way to dip my toe into the many issues that I will face in my research.

The next two chapters look at value and robustness in insider research (which, again, I take to be about being able to defend your methodology – and choosing a good one) and then the ethics and politics of insider research in your university, which is pretty much unavoidable.

He opens with a discussion of some of the criticisms that case studies face as well as some of the responses to these. I’ll leave it here in full as it sums it up well.

Case study researchers may find their work subject to the following criticisms (Flyvbjerg, 2006): it only yields concrete, practical knowledge rather than the supposedly more valuable general, theoretical and propositional knowledge;  generalization from one case is not possible and so the research does not contribute to the development of knowledge; the research can generate hypotheses, but other methods are necessary to test them and build theory; case study design contains a bias toward verification, i.e., a tendency to confirm the researcher’s preconceived notions. Both Flyvbjerg and Yin (2009) refute these criticisms, but those who research their own universities need to be clear about precisely what their research questions are, what the rationale behind the research design is, and what the truth claims are. This advice holds for any kind of research, but other designs tend to draw less critical fire.

(He also highlights Gomm et al (2000), Simons (2009) and Yin (2009) as great starting points for further investigation of case study methodology)

Trowler dips back into some of the ontological questions that he touched on earlier in the book, comparing the merits of the true vs the useful. (I may be oversimplifying this). This draws on Sayer’s notion of “practical adequacy” in prioritising the usefulness of information. I kind of get it but think I’ll need to dig a little deeper. I can see how some true things mightn’t necessarily be valuable but as for things that are kind of true…?

This is echoed in further discussions of Bassey’s idea of “fuzzy generalizations”. In short, this is about acknowledging that life is complex and theory won’t always accommodate the range of factors at play. So that rather than saying that in situation A, if B happens then C will follow, we might say in situation A, if B happens then C will generally follow between D and E% of the time. It’s not as neat and arguably not as helpful but no doubt more realistic.

In terms of the design of research, Trowler posits twelve questions for researchers to consider to test the rigour and quality of their proposed methods.

1. In designing the research, how do I know that my planned approach will answer the questions I have more fully than any other?
2. How do I design the research to take best advantage of the benefits of insider research while avoiding its pitfalls as far as possible?
3. Conceptually, how do I represent my organization, its culture and its practices? (And how does this representation shape my design?)
4. How and from whom will I secure access to the data I need? (Why them and why not others?)
5. Whom should I inform about the project, and how should I describe it, when I seek ‘informed’ consent? (And how might this affect my data?)
6. How will I ensure that the project is run ethically so that all participants and institutional bodies are protected? (While at the same time being as transparent as possible to readers so they can judge the robustness of my approach and conclusions?)
7. If I am using participant observation, what are the ethical issues in relation to the people I observe in natural settings? (And how might my answer to that question affect my data?)
8. If using interviews, what measures should I take to deal with interview bias? (And will they assure a sufficient degree of robustness?)
9. What should the balance be between collecting naturalistic data and formally ‘collected’ data? (And how can I offer assurances of robustness about conclusions drawn from both?)
10. How should I analyse the different forms of data I have, given that there will almost certainly be a large amount of various sorts? (And how do I ensure that sufficient and appropriate weight is given to each form of data in generating conclusions?)
11. How, and how much, will I communicate my findings to participants to ensure that they are happy with what I intend to make public? (And will this affect the way I present my conclusions to other audiences?)
12. Generally, in what other ways can I satisfy the reader about the robustness of my research and its findings?

(I was a little hesitant to just paste this in holus bolus but it all seems particularly valuable).

In very pragmatic terms, there will also inevitably be a number of ethical and political issues to consider when undertaking insider research in one’s own institution. The question of whether to anonymise the institution and the research participants is a live one – though at this stage, to me, it seems impractical and counterproductive, particularly when it could mean reducing the number of sources of data for fear of not being able to correctly reference them. The lack of transparency could also arguably lessen a reader’s view of the robustness of the research.

I also think that the question of to what extent people change their behaviour under observation is a valid one, however there are no doubt ways to mitigate this.

Politically, senior leaders in the organisation will imaginably want to feel confident that the research won’t damage the reputation of the university before granting access. Does this then lead to self-censorship and selective reporting if there are areas where there is room for improvement?

At a higher level of ethical debate, the selection of standpoints from which to pose questions and to begin observations and investigations can raise some concerns. If I fail to incorporate the perspective of voices that are less often heard, am I guilty of perpetuating the status quo?

Lots to think about and to be perfectly blunt, I would be naive not to factor in the question of what asking the wrong person the wrong question might mean for my career prospects in the organisation. Fortunately I have a little time to think about some of these things.