Paper accepted to ASCILITE 2014 by Colin Beer, David Jones and Rolley Tickner.
Abstract
There is growing rhetoric about the potential of learning analytics in higher education. There is also concern about what the growing hype around learning analytics will mean for the reality. Will learning analytics be a repeat of past mistakes, where technology implementations fail to move beyond a transitory fad and provide meaningful and sustained contributions to learning and teaching? How can such a fate be avoided? This paper identifies three paths that learning analytics implementations might take, with particular consideration to their likely impact on learning and teaching. An ongoing learning analytics project at a regional Australian university – currently used by hundreds of teaching staff to support early interventions to improve student retention – is examined in relation to the three paths, and some implications, challenges and future directions are discussed.
Keywords: learning analytics, learning, teaching, data, complexity, bricolage
Introduction
The delivery of distance education via the Internet is the fastest growing segment of adult education (Carr-Chellman, 2004; Macfadyen & Dawson, 2010) and there is considerable pressure for institutions to ‘join the herd’. Burgeoning demand for university places, increased competition between universities, and globalisation, coupled with reduced public funding, are driving universities to expend time and resources on e-learning (Ellis, Jarkey, Mahony, Peat, & Sheely, 2007). There is, however, evidence to suggest that the ubiquitous adoption of learning management systems (LMS) to meet institutional e-learning needs has constrained innovation and negatively impacted the quality of the learning experience (Alexander, 2001; Paulsen, 2002). This has contributed to a gap between the rhetoric around the virtues of e-learning and the complicated reality of the e-learning ‘lived experience’. Increasingly, the adoption of technology by universities is being driven by a search for a panacea that will bridge this gap, and is showing a tendency toward faddism.
Managerial faddism or hype is the tendency of people to eagerly embrace the newest fad or technology of the moment and to see problems as being largely solvable (or preventable) through better or more ‘rational’ management (Goldfinch, 2007). Birnbaum (2001) says of managerial fads: “they are usually borrowed from other settings, applied without full consideration of their limitations, presented either as complex or deceptively simple, rely on jargon, and emphasize rational decision making” (p. 2). Maddux and Cummings (2004) suggest that the use of information technology in higher education has been "plagued by fad and fashion since its inception" (p. 514). It is argued that management hype cycles are propagated by the top-down, teleological approaches that dominate technology innovation, and indeed management, in higher education (Duke, 2001). Given the higher education sector’s disposition to adopting technological concepts based on hype and apparent rationality (Duke, 2001), there is a danger that the implementation of emerging technology-related concepts, such as learning analytics (LA), will fail to make sustained and meaningful contributions to learning and teaching (L&T).
The aim of this paper is to explore how LA can avoid becoming yet another fad, by analysing the likely implementation paths institutions might take. The paper starts by examining what we now know about LA, finding evidence to suggest that LA is in the midst of a hype cycle that is likely to impede its ability to provide a sustained and meaningful contribution to L&T. The paper then examines some conceptual and theoretical frameworks around hype cycles, technology implementation, complex systems and models of university learning. These frameworks form the basis for identifying and analysing three likely paths universities might take when implementing LA. CQUniversity’s recent experience with a LA project that aims to assist with student retention is drawn upon to compare and contrast these paths before implications and future work are presented.
What we know about learning analytics
Johnson et al. (2013) define Learning Analytics (LA) as the collection and analysis of data in education settings in order to inform decision-making and improve L&T. Siemens and Long (2011) define LA as “the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs” (p. 34). Others have said, “learning analytics provides data related to students’ interactions with learning materials which can inform pedagogically sound decisions about learning design” (Fisher, Whale, & Valenzuela, 2012, p. 9). Definitions aside, it can be said that the widespread use of technology in higher education has allowed the capture of detailed data on events that occur within learning environments (Pardo, 2013). The explosion in the use of digital technologies with L&T has contributed to the sector’s growing interest in LA due to the ability of technology to generate digital trails (Siemens, 2013a), which can be captured and analysed. These digital trails have the potential to inform L&T practices in a variety of ways.
It is said that LA can contribute to course design, student success, faculty development, predictive modelling and strategic information (Diaz & Brown, 2012). Others say that LA can identify students at risk, highlight student learning needs, aid in reflective practice and enable teachers to appropriately tailor instruction, among other things (Johnson et al., 2013). Reports abound identifying LA as a key future trend in L&T, with many reporting its rise to mainstream practice in the near future (Johnson et al., 2013; Lodge & Lewis, 2012; New Media Consortium, 2012; Siemens, 2011). Siemens and Long (2011) typify this rhetoric when they say that LA “is essential for penetrating the fog that has settled over much of higher education” (p. 40). While LA still has a long way to go to live up to this expectation, the prospect of evidence-informed practice and improved data services in an increasingly competitive and online higher education marketplace is fuelling institutions’ interest in LA.
There are many reasons for the normative pull towards improved data services in higher education. Administrators demand data for the support of resource and strategic planning; faculty and administrators are hungry for information that can assist institutions with student recruitment and student retention; and external agencies, such as governments, require a range of data indicators about institutional performance (Guan, Nunez, & Welsh, 2002). Prior to the emergence of LA, this desire for improved data services had, in many cases, led to the adoption of data warehouses by universities. A data warehouse is “… a subject-oriented, integrated, non-volatile and time-variant collection of data in support of management’s decisions” (Inmon, 2002). Data warehouses grew out of decision support systems and their use has escalated over recent years with increasing volumes and varieties of data being collected by institutions (Guan et al., 2002). Unfortunately, and despite these large volumes of data, data warehouses suffer from high failure rates and limited use (Goldfinch, 2007).
It has been said that a majority of information systems (IS) fail, and the larger the development, the more likely it is to fail (Goldfinch, 2007). While there are many potential reasons for IS project failure, managerial faddism, management approaches and immense complexity are shown to be significant factors (Goldfinch, 2007). These factors are of particular concern for LA, due to a range of underlying complexities and the ‘contextuality’ of what LA is representing (Beer, Jones, & Clark, 2012). Managerial faddism and management approaches to technology adoption can constrain an implementation’s ability to deal with complexity, as solutions are often presented as universally applicable ‘quick fixes’ (Birnbaum, 2001). This is a concern for LA, as there is evidence to suggest that it is currently in the midst of a hype cycle.
The learning analytics hype
In observing the growing interest and attempted implementations of learning analytics within Australian universities, it is increasingly apparent that learning analytics is showing all the hallmarks of a management fashion or fad (Jones, Beer, & Clark, 2013). Fads are innovations that appear to be rational and functional, and are aimed at encouraging better institutional performance (Gibson & Tesone, 2001). Environmental factors such as increasing competition, regulation and turbulence contribute to the development of fads where there is an overwhelming desire to ‘be part of the in crowd’ (Gibson & Tesone, 2001). Fads often ‘speak to managers’ in that they appear to be common-sense and appeal to organisational rationality around efficiency and effectiveness, which makes counter-argument difficult (Birnbaum, 2001). Learning analytics talks strongly to managerialism due to its potential to facilitate data-driven decision-making and to complement existing institutional business intelligence efforts (Beer et al., 2012). Once an innovation such as LA achieves a high public profile, it can create an urgency to ‘join the bandwagon’ that swamps deliberative, mindful behaviour (Swanson & Ramiller, 2004).
The Horizon Project is an ongoing collaborative research effort between the New Media Consortium and various partners to produce annual reports intended to help inform education leaders about significant developments in technology in higher education. LA has been mentioned in the Horizon Project’s reports in some form for the last five years. In the 2010 and 2011 reports, visual data analysis (Johnson, Levine, Smith, & Stone, 2010) and then learning analytics (Johnson et al., 2011) were placed in the four to five year time frame for widespread adoption. In 2012 and 2013, perhaps as a sign of developing hype, LA moved to ‘1 year or less until widespread adoption’. However, in a 2014 report (Johnson, Adams Becker, Cummins, & Estrada, 2014), the prediction about the widespread adoption of learning analytics moved back to the two to three year time frame. Johnson et al. (2014) explain that this lengthening of the time frame is not because “learning analytics has slowed in Australian tertiary education” (p. 2), but is instead due to new aspects of learning analytics that add “more complexity to the topic that will require more time to explore and implement at scale” (p. 2). Could this perhaps echo Birnbaum’s (2001) earlier observation that fads are often presented as complex or deceptively simple? During a trip to Australia in 2013, George Siemens, a noted international scholar in the LA arena, said “I’m not familiar with (m)any universities that have taken a systems-level view of LA… Most of what I’ve encountered to date is specific research projects or small deployments of LA. I have yet to see a systemic approach to analytics use/adoption” (Siemens, 2013b).
The gathering hype around LA (Jones et al., 2013) appears to be following a similar trend to that in the business world around the concept of “big data” – the analysis and use of large datasets in business. Universities also followed the business world with the widespread adoption of data warehouse technology for decision support (Ramamurthy, Sen, & Sinha, 2008). While data warehouses have been around for some time, they have been plagued by high failure rates and limited spread or use (Ramamurthy et al., 2008). This is indicative of a larger trend in industry, where “the vast majority of big data and magical business analytics project fail. Not in a great big system-won’t-work way… They fail because the users don’t use them” (Schiller, 2012). The adoption of these technologies appears to be perilous even when the rush to adoption is not being driven by hype. If learning analytics is showing all the signs of being yet another fad, what steps can organisations take to avoid this outcome? The following section describes some theoretical frameworks that are drawn upon to help identify potential paths.
Theoretical frameworks
Hype cycles characterise a technology’s typical progression from an emerging technology to either productive use or disappointment (Linden & Fenn, 2003). Hype cycles have been linked to a recognition that imitation is often the driving force behind the diffusion of any technological innovation (Ciborra, 1992). Birnbaum (2001) suggests that technology hype cycles start with a technological trigger, which is followed by a growing revolution and the rapid expansion of narrative and positivity around the technology. Then comes the period of minimal impact, where enthusiasm for the technology starts to wane and initial reports of success become tempered by countervailing reports of failure. This is followed by the resolution of dissonance, where the original promoters of the fad seek to explain its failure to achieve widespread impact. Such explanations tend not to blame the fad itself, but instead attribute the failure to "a lack of leadership, intransigence of followers, improper implementation, and lack of resources" (Birnbaum, 2001). Hype cycles are linked with teleological or top-down approaches to technology adoption, which have primacy in higher education (Birnbaum, 2001). This practice seems ignorant of research suggesting that ateleological or bottom-up approaches to technology adoption can lead to more meaningful implementations (Duke, 2001).
Defining what constitutes a successful implementation of an Information or Communication Technology (ICT) is perilous. The conventional approach to recognising a successful ICT project, according to Marchand and Peppard (2013), relates to some easily answered questions. Does it work? Was it deployed on time? Was it within budget? Did it adhere to the project plan? Goldfinch (2007) extends this to say that ICT projects can often fail simply because they are not used as intended, or users do not use them at all, for reasons such as recalcitrance, lack of training or poor usability. More traditional project success measures might be useful for straightforward ICT projects where the requirements can be determined at the design stage; however, ICT projects around data and analytics are much more difficult to evaluate in terms of success or failure (Marchand & Peppard, 2013). These systems require people to interpret and create meaning from the information the systems provide. While deploying analytical ICT is relatively easy, understanding how it might be used is much less clear, and these projects cannot be mapped out in a neat fashion (Marchand & Peppard, 2013). This suggests that traditional ‘top down’ approaches associated with technology implementation might be less than ideal for LA implementations.
Teleological, top-down or plan-based approaches dominate technology adoption in higher education (McConachie, Danaher, Luck, & Jones, 2005). Known as planning or plan-based approaches, they are typically idealistic, episodic and follow a deliberate plan or strategy (Boehm & Turner, 2003). The suitability of these approaches for resolving complex problems has been questioned (Camillus, 2008). By contrast, ateleological or learning approaches follow an emergent path and are naturalistic and evolutionary (Kurtz & Snowden, 2003). The debate between the planning and learning schools of process has been one of the most pervasive debates in the management literature (Clegg, 2002) with many authors critically evaluating the two schools (e.g., Mintzberg, 1989; Kurtz & Snowden, 2003; McConachie et al, 2005).
The use of planning-based processes for the implementation of LA projects creates a problem when online learning environments are acknowledged as non-linear complex systems (Barnett, 2000; Beer et al., 2012; Mason, 2008a, 2008b). Complex systems are systems that adapt, learn or change as they interact (Holland, 2006). They are non-linear in that they contain nested agents and systems that are all interacting and evolving, so we cannot understand any of the agents or systems without reference to the others (Plsek & Greenhalgh, 2001). Cause and effect are not evident and cannot be predicted, meaning that even small interventions can have far-reaching, disproportionate and impossible-to-predict consequences (Boustani et al., 2010; Shiell, Hawe, & Gold, 2008). If LA is about understanding learners and the contexts within which they learn, considering online learning environments as complex systems has a profound effect on how we approach LA projects. It follows from this that what contemporary universities need is the most productive elements of both teleological and ateleological approaches to the eight elements of the design process identified by Introna (1996). Such a synthesis is crucial to addressing the plethora of issues competing for the attention of university decision-makers, whether in Australia or internationally.
The development of LA tools and processes is only the first of the steps Elias (2011) identifies as necessary for the implementation of LA. The second step identified by Elias (2011), and arguably the far more difficult step, is "the integration of these tools and processes into the practice of teaching and learning" (p. 5). Beer et al. (2012) argue that it is the teachers who are likely to have the right mix of closeness to, and expertise with, the learning context to make the best use of LA-derived information, echoing earlier arguments that teachers are perhaps the most important element of any attempt to enhance L&T (Radloff, 2008). Achieving such a goal would appear to require some understanding of the practice of teaching and learning. One such understanding is provided by Trigwell’s (2001) model of university teaching. As shown in Figure 1, Trigwell’s (2001) model suggests that the student learning experience is directly related to teachers’ strategies, teachers’ planning, teachers’ thinking – including knowledge, conceptions and reflections – and the L&T context. This is difficult because the teacher’s context is complex and dynamic. If LA is representing data about learners and their contexts, and its goal is to enhance L&T, it is crucial that it engages with teachers and their dynamic contexts (Sharples et al., 2013).
Figure 1. Trigwell’s (2001) model of university teaching.
The three paths
Based on the preceding theoretical perspectives and personal experience within Australian universities, it is possible to identify at least three potential paths – ‘do it to’, ‘do it for’, and ‘do it with’ – that universities might take when considering how to harness LA. In the rest of the paper we describe these three paths and then use them to understand the emergence of an LA project at a particular Australian university.
Do it to the teachers
‘Do it to’ describes the top-down, techno-rational and typical approach to ICT adoption in higher education. In theory, this approach starts with the recognition that LA aligns with identified institutional strategic goals. From there a project is formed that leads to a technology being identified and considered at the institutional level, usually with input from a small group of people, before being implemented institution-wide. ‘Do it to’ approaches will typically involve the setting up of a formal project with appropriate management sponsorship, performance indicators, budgets, project teams, user groups, and other project management requirements.
The ‘do it to’ approach focuses much of its attention on changing the teaching and learning context (the left-hand end of Figure 1) in terms of policies and systems. The assumption is that this change in context will lead to changes in teacher thinking, planning and strategy. ‘Do it to’ provides a focus on creating the right context for L&T, but its effect on teacher thinking, planning and strategy is arguably deficient. ‘Do it to’ represents a mechanistic approach that, although common, is likely to fail (Duke, 2001), and this is particularly troublesome for LA implementations for a range of reasons.
The difficulty of ICT implementation for data and analytics projects (Marchand & Peppard, 2013) is compounded in LA due to its novelty and an absence of predefined approaches that are known to work across multiple contexts (Siemens, 2013a). L&T contexts are complex and diverse (Beer et al., 2012), and imposing technological solutions on these environments can lead to a problem of task corruption, where staff engagement is superficial and disingenuous (Rosh White, 2006). Centralised approaches to LA can often be mistakenly viewed as purely an exercise in technology (Macfadyen & Dawson, 2012) and may provide correlative data that can be misleading or erroneous at the course or individual levels (Beer et al., 2012).
Do it for the teachers
Geoghegan (1994) identifies the growth of a "technologists’ alliance" between innovative teaching staff, central instructional technology staff and information technology vendors as responsible for many of the developments that seek to harness information technology to enhance student learning. While this alliance is often called upon to contribute to the "do it to" path, its members are also largely responsible for the "do it for" path. Driven by a desire and responsibility to enhance student learning, members of the alliance will seek to draw on their more advanced knowledge of technology and its application to learning and teaching to: organize staff development sessions; experiment with, adopt or develop new applications of technology; and help with the design and implementation of exemplar information technology enhanced learning designs. Such work may lead to changes in the L&T context – in much the same way as the "do it to" path – through the availability of a new Moodle plugin for learning analytics or visits from experts on certain topics. It can also lead to changes in the thinking, planning and strategies of small numbers of teaching staff, typically those innovative teaching staff participating in the exemplar applications of technology who are becoming, or are already, part of the technologists’ alliance.
While the technologists’ alliance is responsible for many of the positive examples of harnessing information technology to enhance L&T, Geoghegan (1994) also argues that its members have "also unknowingly worked to prevent the dissemination of these benefits into the much larger mainstream population". Geoghegan (1994) attributes much of this to the extensive differences between the members of the technologists’ alliance and the majority of teaching staff. Rather than recognizing and responding to this chasm, there has been a failure to recognize its existence, an assumption of homogeneity, and a belief that it is simply a matter of overcoming increased resistance to change, rather than addressing qualitatively distinct perspectives and needs (Geoghegan, 1994).
Do it with the teachers
This approach is firmly entrenched in the learning approach mentioned previously. This path starts by working with teaching academics inside the course or unit ‘black box’ during the term. The idea is to develop an understanding of the lived experience, encompassing all its diversity and difficulty, so as to establish how LA can help contribute within that context. The aim is to fulfil Geoghegan’s (1994) advice to develop an application “well-tuned to the instructional needs” that provides a “major and clearly recognizable benefit or improvement to an important process”. Such applications provide those outside the technologists’ alliance with a compelling reason to adopt a new practice. It is through the adoption of new practices that educators can gain new experiences “that enable appropriation of new modes of teaching and learning that enable them to reconsider and restructure their thinking and practice” (Cavallo, 2004), an approach which Cavallo (2004) argues is significantly more effective in changing practice than “merely being told what to do differently” (p. 97). Thus the ‘do it with’ path starts with the current reality of L&T – the right-hand end of Figure 1 – and works back toward the left.
Beyond being a potentially more effective way of changing thinking and practice around learning, the "do it with" approach brings a number of other potential benefits. This type of bottom-up or evolutionary approach is also known as bricolage – "the art of creating with what is at hand" (Scribner, 2005) – and has been identified as a major component of how teachers operate (Scribner, 2005). It is also a primary source of strategic benefit from information technology (Ciborra, 1992). However, the ‘do it with’ path also has some hurdles to overcome. These approaches are messy and tend not to fit with existing institutional approaches to technology adoption or innovation. Learning approaches are agile and require freedom to adapt and change, which clashes with existing organisational cultural norms around technology innovation, implementation and uniformity. ‘Do it with’ approaches do not fit with existing organisational structures that are rationally decomposed into specialised units (Beer et al., 2012). Other problems can be attributed to workloads and competing requirements, which can inhibit the collaborative, reflective and adaptable approaches required for bricolage. There are also questions about whether or not such approaches can be translated into sustainable, long-term practices.
A question of balance
The three approaches described above are not mutually exclusive. Elements of all three are very likely to, and perhaps need to, exist within LA implementations. It is a question of balance. The typical approach to ICT implementation is ‘do it to’, which constrains the impact the implementation might have on L&T. This paper has suggested that ‘do it with’, and even ‘do it for’, approaches may allow LA to make more sustained and meaningful contributions to L&T. However, they contrast starkly with existing institutional technology adoption and implementation norms based on ‘do it to’. While the way forward may not be clear, it is clear that we need a better balance between all three of these approaches if LA is going to enhance learning, teaching and student success. The following section describes a LA implementation at a regional Australian university with a very complex and diverse L&T environment.
EASI @ CQU
EASI, or Early Alert Student Indicators, is a LA project at CQUniversity targeting a strategic goal around student retention by improving academic-student contact. It combines student descriptive data from the student information system with student behaviour data from the Moodle LMS, and provides this data, in real time, to teaching academics within their Moodle course sites. It also provides the academics with a number of ways by which they can ‘nudge’ students who might be struggling at any point during the term. The Term 1, 2014 trial was deemed very successful, with 5,968 visits to EASI across the term by 283 individual academic staff who looked at 357 individual courses. A majority of the 39,837 nudges recorded were mail-merges, where academics used the in-built mail-merge facility to send personalised emails to students. The 7,146 students who received at least one ‘nudge’ email during the term had, by the end of term, 51% more Moodle clicks on average than students who did not receive nudges. This may be indicative of heightened engagement and aligns with anecdotal comments from academics, who indicated that the personalised email ‘nudges’ promoted increased student activity and dramatically elevated staff-student conversations.
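To make the data combination concrete, the following minimal sketch illustrates, in Python, the kind of join EASI performs: descriptive data from a student information system extract combined with click counts from a Moodle activity log, with low-activity students selected for a personalised ‘nudge’ email. This is an illustration only; the file names, field names and activity threshold are hypothetical assumptions and are not drawn from the actual EASI implementation.

```python
# Illustrative sketch only: not the actual EASI code. Assumes hypothetical CSV
# extracts from the student information system and the Moodle activity log.
import csv
from collections import Counter


def load_sis(path):
    """Load student descriptive data (one row per student), keyed by student id."""
    with open(path, newline="") as f:
        return {row["student_id"]: row for row in csv.DictReader(f)}


def count_clicks(path):
    """Count Moodle clicks per student from a simple activity log extract."""
    clicks = Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            clicks[row["student_id"]] += 1
    return clicks


def students_to_nudge(sis, clicks, min_clicks=5):
    """Flag students whose click count falls below a (hypothetical) threshold."""
    return [s for sid, s in sis.items() if clicks.get(sid, 0) < min_clicks]


def draft_nudge_email(student):
    """Mail-merge style personalised message; sending is left to the institution's mailer."""
    return (f"Dear {student['first_name']},\n"
            "We noticed you haven't been very active in the course site recently. "
            "Is there anything we can help with?")


if __name__ == "__main__":
    sis = load_sis("sis_extract.csv")         # hypothetical extract file
    clicks = count_clicks("moodle_log.csv")   # hypothetical log extract
    for student in students_to_nudge(sis, clicks):
        print(draft_nudge_email(student))
```

In practice, the value of such a report depends less on the join itself than on presenting the result inside the teacher's existing workflow – in EASI's case, within the Moodle course site – and making the follow-up action, the personalised email, a single step.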
In response to a strategic goal of addressing a growing student retention problem, a formal project was proposed in 2012 via a project proposal document (PPD) that outlined how the project would contribute to the strategic goal. There were more than a dozen iterations of this document before the project gained approval, which then required a project initiation document (PID) to be submitted. The PID, over a number of iterations, provided fine-grained detail on a range of plans for the project including the project plan, project scope, deliverables, milestones, budget and quality. Twelve months after the PPD, work officially began on the project following the final approval of the PID. On the surface it would appear that this particular LA project followed a ‘do it to’ approach with a formal project management methodology, and early indications about its success are encouraging. However, the underlying and invisible reality suggests a different story.
The idea for EASI evolved from many conversations and collaborations, dating back to 2008, between staff from the central L&T unit and coalface academic staff. These conversations and collaborations were predominantly about finding ways of making better use of data to inform L&T. The central L&T staff were unusual in that they were active LA researchers, possessed experience with software development, and were all in daily contact with, and shared insights with, front-line academic teaching staff. The central L&T staff pursued LA in their own time, using informal access to test data that was often incomplete or inconsistent. The EASI concept developed during 2011, when these staff identified the potential for LA to contribute to the strategic imperative of improving student retention. A number of small-scale pilots/experiments were conducted in close partnership with the participating teaching academics on a trial-and-error basis.
These trials occurred prior to the approval of the formal project plan, using a combination of ‘do it with’ and ‘do it for’ paths, before the start of the formal project and its requirements constrained the approach strictly to ‘do it to’. The essence of this story is that the project’s success, as defined by senior management (Reed, 2014), is directly attributable to the tinkering and experimentation that occurred with the front-line academics prior to the commencement of the formal project. The ‘do it with’ and ‘do it for’ components allowed the bricolage that made the implementation meaningful (Ciborra, 1992), while the ‘do it to’ component provided the resourcing necessary to progress the idea beyond the tinkering stage. Perhaps the key message from the EASI experience is that there needs to be a balance between all three approaches if LA is going to make sustained and meaningful contributions to L&T.
Conclusion
A story was told in this paper of an apparently successful ‘do it to’ LA project. It was suggested that this project was successful only because of its underpinning and preceding ‘do it with’ and ‘do it for’ processes. These processes allowed the project to adapt in response to the needs of the users over time, prior to the start of the formal project. Based on this experience and the theoretical frameworks described in this paper, it would appear likely that attempts to implement LA without sufficient ‘do it with’ will fail. Turn-key solutions and the increasing trend toward ‘systems integration’ and outsourcing are unlikely to allow the bricolage required for sustained and meaningful improvement in complex L&T contexts. There is even a question of how long EASI can remain successful given that the formal project, and its associated resourcing, will cease at the end of the project.
While this paper specifically targeted LA, there is a question as to whether the same paths, or combination thereof, are required more broadly for improving L&T in universities. Is the broader e-learning rhetoric/reality gap a result of an increasing amount of ‘do it to’ and ‘do it for’ and not enough ‘do it with’? How much effort are universities investing in each of the three paths? How could a university appropriately follow the ‘do it with’ path more often? What impacts might this have on the quality of learning and teaching? The exploration of these questions may help universities to bridge the gap between e-learning rhetoric and reality.
References
Alexander, S. (2001). E-learning developments and experiences. Education + Training, 43 (4/5), 240-248.
Barnett, R. (2000). Supercomplexity and the Curriculum. Studies in Higher Education, 25 (3), 255-265. doi: 10.1080/03075070050193398
Beer, C., Jones, D., & Clark, D. (2012). Analytics and complexity: Learning and leading for the future. Paper presented at ASCILITE 2012: Future challenges, sustainable futures, Wellington, New Zealand.
Birnbaum, R. (2001). Management fads in higher education: Where they come from, what they do, why they fail. San Francisco: Jossey-Bass.
Boehm, B., & Turner, R. (2003). Using Risk to Balance Agile and Plan-Driven Methods. Computer, 36 (6), 57.
Boustani, M. A., Munger, S., Gulati, R., Vogel, M., Beck, R. A., & Callahan, C. M. (2010). Selecting a change and evaluating its impact on the performance of a complex adaptive health care delivery system. Clinical Interventions In Aging, 5 , 141-148.
Camillus, J. C. (2008). Strategy as a Wicked Problem. Harvard Business Review, 86 (5), 98-106.
Carr-Chellman, A. A. (2004). Global perspectives on e-learning: Rhetoric and reality. Sage.
Cavallo, D. (2004). Models of growth—towards fundamental change in learning environments. BT Technology Journal, 22 (4), 96-112.
Ciborra, C. U. (1992). From thinking to tinkering: The grassroots of strategic information systems. The Information Society, 8 (4), 297-309.
Diaz, V., & Brown, M. (2012). Learning analytics: A report on the ELI focus session (ELI Paper 2: 2012). Educause Learning Initiative.
Duke, C. (2001). Networks and Managerialism: field-testing competing paradigms. Journal of Higher Education Policy & Management, 23 (1), 103-118. doi: 10.1080/13600800020047270
Elias, T. (2011). Learning analytics: Definitions, processes and potential. Learning, 23 , 134-148.
Ellis, R. A., Jarkey, N., Mahony, M. J., Peat, M., & Sheely, S. (2007). Managing Quality Improvement of eLearning in a Large, Campus-Based University. Quality Assurance in Education: An International Perspective, 15 (1), 9-23.
Fisher, J., Whale, S., & Valenzuela, F.-R. (2012). Learning Analytics: a bottom-up approach to enhancing and evaluating students' online learning. University of New England: Office for Learning and Teaching.
Geoghegan, W. (1994). Whatever happened to instructional technology? Paper presented at the 22nd Annual Conference of the International Business Schools Computing Association.
Gibson, J. W., & Tesone, D. V. (2001). Management fads: Emergence, evolution, and implications for managers. Academy of Management Executive, 15 (4), 122-133. doi: 10.5465/AME.2001.5898744
Goldfinch, S. (2007). Pessimism, computer failure, and information systems development in the public sector. Public Administration Review, 67 (5), 917-929.
Guan, J., Nunez, W., & Welsh, J. F. (2002). Institutional strategy and information support: the role of data warehousing in higher education. Campus-Wide Information Systems, 19 (5), 168.
Holland, J. (2006). Studying Complex Adaptive Systems. Journal of Systems Science and Complexity, 19 (1), 1-8. doi: 10.1007/s11424-006-0001-z
Inmon, W. H. (2002). Building the data warehouse (3rd ed.). New York: Wiley.
Introna, L. D. (1996). Notes on ateleological information systems development. Information Technology & People, 9 (4), 20-39.
Johnson, L., Adams Becker, S., Cummins, M., & Estrada, V. (2014). 2014 NMC Technology Outlook for Australian Tertiary Education: A Horizon Project Regional Report. Austin, Texas: The New Media Consortium.
Johnson, L., Adams, S., Cummins, M., Freeman, A., Ifenthaler, D., & Vardaxis, N. (2013). Technology Outlook for Australian Tertiary Education 2013-2018: An NMC Horizon Report Regional Analysis. Austin, Texas: The New Media Consortium.
Johnson, L., Becker, S., Estrada, V., & Freeman, A. (2014). Horizon Report: 2014 Higher Education.
Johnson, L., Levine, A., Smith, R., & Stone, S. (2010). The 2010 Horizon Report : ERIC.
Johnson, L., Smith, R., Willis, H., Levine, A., Haywood, K., New Media, C., & Educause. (2011). The 2011 Horizon Report: New Media Consortium.
Jones, D., Beer, C., & Clark, D. (2013). The IRAC framework: Locating the performance zone for learning analytics . Paper presented at the Electric Dreams., Sydney. http://www.ascilite.org.au/conferences/sydney13/program/papers/Jones.pdf
Kurtz, C. F., & Snowden, D. J. (2003). The new dynamics of strategy: Sense-making in a complex and complicated world. IBM Systems Journal, 42 (3), 462-483.
Linden, A., & Fenn, J. (2003). Understanding Gartner’s hype cycles. Strategic Analysis Report Nº R-20-1971. Gartner, Inc .
Lodge, J., & Lewis, M. (2012). Pigeon pecks and mouse clicks: Putting the learning back into learning analytics. Paper presented at ASCILITE 2012, Wellington.
Macfadyen, L. P., & Dawson, S. (2010). Mining LMS data to develop an “early warning system” for educators: A proof of concept. Computers & Education, 54 (2), 588-599. doi: 10.1016/j.compedu.2009.09.008
Macfadyen, L. P., & Dawson, S. (2012). Numbers Are Not Enough. Why e-Learning Analytics Failed to Inform an Institutional Strategic Plan. Journal of Educational Technology & Society, 15 (3), 149-163.
Maddux, C., & Cummings, R. (2004). Fad, fashion, and the weak role of theory and research in information technology in education. Journal of Technology and Teacher Education, 12 (4), 511-533.
Marchand, D. A., & Peppard, J. (2013). Why IT Fumbles Analytics. Harvard Business Review, 91 (1), 104-112.
Mason, M. (2008a). Complexity theory and the philosophy of education. Educational Philosophy and Theory, 40 (1), 15. doi: 10.1111/j.1469-5812.2007.00412.x
Mason, M. (2008b). What Is Complexity Theory and What Are Its Implications for Educational Change? Educational Philosophy and Theory, 40 (1), 35-49.
McConachie, J., Danaher, P. A., Luck, J., & Jones, D. (2005). Central Queensland University’s Course Management Systems: Accelerator or Brake in Engaging Change? International Review of Research in Open and Distance Learning, 6 (1).
New Media Consortium. (2012). The NMC Horizon Report: 2012 Higher Education Edition. Austin, Texas: The New Media Consortium and Educause Learning Initiative.
Pardo, A. (2013). Social learning graphs: combining social network graphs and analytics to represent learning experiences. International Journal of Social Media and Interactive Learning Environments, 1 (1), 43-58.
Paulsen, M. (2002). Online education systems in Scandinavian and Australian universities: A comparative study. The International Review of Research in Open and Distance Learning, 3 (2).
Plsek, P. E., & Greenhalgh, T. (2001). Complexity science: The challenge of complexity in health care. BMJ (Clinical Research Ed.), 323 (7313), 625-628.
Radloff, A. (2008). Engaging staff in quality learning and teaching: what’s a Pro Vice Chancellor to do? Sydney: HERDSA.
Ramamurthy, K. R., Sen, A., & Sinha, A. P. (2008). An empirical investigation of the key determinants of data warehouse adoption. Decision Support Systems, 44 (4), 817-841.
Reed, R. (2014, 10/7/2014). [EASI project success].
Rosh White, N. (2006). Tertiary education in the Noughties: the student perspective. Higher Education Research & Development, 25 (3), 231-246. doi: 10.1080/07294360600792947
Schiller, M. J. (2012). Big Data Fail: Five Principles to Save Your BI Butt. Retrieved 1/6/2014, from http://www.cioinsight.com/c/a/Expert-Voices/Big-Data-Fail-Five-Principles-to-Save-Your-BI-Butt-759074/
Scribner, J. P. (2005). The problems of practice: Bricolage as a metaphor for teachers’ work and learning. Alberta Journal of Educational Research, 51 (4).
Sharples, M., McAndrew, P., Ferguson, R., FitzGerald, E., Hirst, T., & Gaved, M. (2013). Innovating Pedagogy 2013 (Report 2). Milton Keynes, United Kingdom: The Open University.
Shiell, A., Hawe, P., & Gold, L. (2008). Complex interventions or complex systems? Implications for health economic evaluation. BMJ, 336 (7656), 1281-1283.
Siemens, G. (2011). Learning and Knowledge Analytics. Retrieved 1/11/2011, from http://www.learninganalytics.net/?p=131
Siemens, G. (2013a). Learning Analytics: The Emergence of a Discipline. American Behavioral Scientist, 57 (10), 1380-1400. doi: 10.1177/0002764213498851
Siemens, G. (2013b). [Systems level learning analytics].
Siemens, G., & Long, P. (2011). Penetrating the Fog: Analytics in Learning and Education. Educause Review, 46 (5), 9. Retrieved from Educause Review Online website: http://www.educause.edu/ero/article/penetratingfog-analytics-learning-and-education
Swanson, E. B., & Ramiller, N. C. (2004). Innovating mindfully with information technology. MIS Quarterly, 553-583.
Trigwell, K. (2001). Judging university teaching. International Journal for Academic Development, 6 (1), 65-73. doi: 10.1080/13601440110033698