EDCI 336 - Alex

This site will house my weekly blogs for the course, my free inquiry posts, and my group project

Weekly Reflection 10: Säljö (1999), “Learning as the Use of Tools: A sociocultural perspective on the human-technology link”

In this article, Säljö encourages readers to adopt a view of technology that differs from the conventional, commonsensical one. Technology is not something separate from and outside of human society and objective knowledge. Instead, technology, broadly conceived, is an integral part of our natural and social worlds, which have been so profoundly mediated and shaped by human technologies of all kinds as to be inseparable from them.

Traditional views of knowledge have not taken full account of the role of human technology in shaping the object of study. Empirical-realist views emphasize observation of the natural world and inductive reasoning to construct facts and knowledge. The other epistemological perspective, the idealist-rationalist one, takes the view that humans can construct knowledge about the world through deductive reasoning. Both views, however, seem to share the assumption that human knowledge exists outside of humans in objective form, and that this knowledge can be learned by humans. By contrast, the sociocultural perspective that Säljö advances understands the relation between technology and knowledge differently. Technologies, broadly conceived, of both the physical and cognitive varieties, have always been part of the human experience. Humans have used various technological tools to relate to others and their environment and to solve physical and cognitive problems. The use of these tools has had a double effect: on the one hand, they have expanded our access to knowledge; on the other, they have themselves shaped the nature of ‘objective’ reality, both social and physical.

Examples of cognitive tools which have shaped the very nature of knowledge include mathematics and language. Both are, at bottom, tools. They give us access to knowledge, but they also serve to construct the very nature of reality. Both give us categories with which to conceptualize the world and therefore bring into existence phenomena that would otherwise have no existence for us. Another example given by the author in this connection is the clock. The clock is a technology which gives us access to greater knowledge about time. At the same time, however, the clock actually defines and constructs the very nature of time.

Again, this view of technology, and of the inextricable connection between it, our social and natural worlds, and our mental conceptions, is reminiscent of Marx. Recall that Marx proposed the existence of various ‘moments’ of human and social reality: nature, technology, social relations, relations of production, and mental conceptions. These are dialectically related, so that changes in one ‘moment’ tend to bring about changes in the others. I believe that this is the correct way of conceptualizing the relation of technology to ourselves, nature, and the social world. When a new technology is introduced, it serves to alter our social and economic relations, our relation to nature, and (importantly for this article) our mental conceptions. Technology therefore plays a crucial role in constructing human knowledge itself.

If we conceive of technology in this broad way, and understand how various technologies are inextricably bound up with human knowledge, a certain view of the role of technology in learning suggests itself. In the classroom, we have little choice but to teach mastery of these technological tools. Math and language are subjects that consist almost entirely of learning to master the tools themselves. For their part, ‘higher-tech’ technologies must be taught where they have become an inseparable part of our social or economic worlds, or where they have permanently transformed the nature of human knowledge and society. This is the case for technologies such as social media: human society is now basically inseparable from these tools.

Article:

https://ebookcentral.proquest.com/lib/uvic/reader.action?docID=166080&ppg=159

Weekly Reflection 9: Selwyn, Hillman, Eynon, Ferreira, Knox, Macgilchrist, and Sancho-Gil (2020), “What’s Next for Ed-Tech? Critical Hopes and Concerns for the 2020s”

This article summarizes some of the main challenges that educational technology will raise in the coming decade. It is filled with interesting ideas and makes a fitting conclusion for my final weekly reflection. Below, I summarize what I consider the four most interesting challenges and add some of my own thoughts to those the authors present. These tend to take the form of questions that are difficult to know how to answer.

New forms of digital inclusion/exclusion

This involves asking who benefits more from new technologies, and who benefits less. Traditionally, educational technology researchers have tended to view technology as inherently good, with the inequalities generated by new technology simply reflecting the fact that some people (by virtue of socio-economic status and educational level) are better able to absorb and integrate new technologies than others. However, as I have stated in previous posts, this ignores the larger socio-economic framework which birthed the technology in the first place, and the imprint of that context which the technology carries with it. What if, Selwyn et al. ask, rather than inequality being a function of differential ability to integrate new technologies, the technologies themselves tend to exacerbate pre-existing social inequalities and hierarchies? How does this work? Perhaps the work world is becoming increasingly polarized according to levels of technological fluency. If so, and if you have a pre-existing advantage with technology due to higher educational levels or higher socio-economic status, technology only increases the social distance between you and those who lack these advantages. If technology does have an inherent tendency to widen the socio-economic divide, is there a way to use it that works against this tendency and reduces inequality?

Platform economics in an age of artificial intelligence

Here, the issue is that the tech corporations providing platforms for educational institutions benefit from their integration by using them to mine student data, both to target advertising at students and to ‘educate’ their artificial intelligence programs. This gets us into some of the same terrain as the debate over the use of Google in the classroom. As one person interviewed in that podcast said: “if you are getting a product for free, then you are the product” (my paraphrase). When integrating platforms such as Google into the classroom, therefore, we need to do a cost/benefit analysis of what the company is gaining, what students and educators are getting, and what the costs are. Is it worth selling student data in order to have access to this platform? What is the cost of putting up with targeted advertising, or the risk that one’s data will be used to commit fraud or scams against oneself? What about the larger implications of helping companies perfect their AI? This is touched on in the following section. How is this AI used to shape subsequent human behavior, and what does it do to our political economy to have powerful AI, controlled by large companies, able to do the tasks that humans used to do?

‘Divisions of learning’ across humans and machines  

Artificial intelligence raises numerous questions for educational technology researchers, and for researchers across many other disciplines. Who controls and benefits from AI? How is it reshaping our political economy? And what are the implications for the future of education? Allow me to unpack these questions a little. Large corporations control AI. One of the applications to which AI is put is to learn our thoughts, preferences, prejudices, and behaviors. AI can then be used to sell us things, but it can also be used to influence our behavior in other ways that are not always socially advantageous. Perhaps, in an effort to capture our attention, it is used to spit our worst tendencies and prejudices back at us in a way that intensifies them, and social polarization with them, as can be seen with Facebook, for example. In addition, AI may be changing our political economy. Clearly the work world is being transformed as AI learns to do more and more tasks previously undertaken by humans. This may make many jobs, and the livelihoods that depend on them, obsolete in the near future. This raises larger questions about the nature of work and the way it is linked to the distribution of resources. If the two remain as tightly connected as they are now, then AI is paving the road to a future that is even more polarized, between those who control AI and the means of production and a growing army of unemployed, redundant, immiserated former workers. Finally, AI raises new questions about the future of education. Perhaps educational researchers should also begin to look at the ‘education’ of AI. They also need to theorize human learning in the context of rapidly evolving AI. What skills should humans develop when highly developed AI is taking over many tasks previously fulfilled by people? What professional niches can humans continue to occupy in a work world dominated by AI?

IT industry actors as a leading educational force

Selwyn et al. note that large tech corporations are the ones leading the charge to integrate new tech into the classroom. They say that while there is nothing inherently wrong with business taking such a leading role in education, sufficient oversight and regulation have to be developed to keep corporations from influencing educational policy in ways that benefit their own bottom line over and against the public interest in education. While I agree with the second half of this sentence, regarding the need for regulation and oversight of corporations involved in education, I could not disagree more with its first clause, that there is nothing inherently wrong with corporations being involved in education. In fact, I would state the exact opposite: there is something inherently wrong and deleterious about having private business involved in education in any way. As soon as private business becomes involved in education, the purpose of education and its ability to act in the public good are undermined. Business begins to subtly interfere with educational policy, putting its own profit motive above the objective of forming democratic citizens. Business in the classroom can subtly shape the perceptions, attitudes, and narratives that are transmitted in the educational process, making them more pro-business and developing only the skills and attitudes needed to form the next generation of workers. When business becomes involved in education, the ability of education to be an independent pillar of society, one which stands apart from society and imparts the abilities to critique it, is eroded. The more business becomes involved in education, the more it (subtly and not so subtly) undertakes the reshaping of society in ways that benefit its own world view and consolidate its own domination over people, government, and civil society. This is why, I believe, the entrance of corporations into public institutions must be resisted, in principle, in all of its forms.

Article:

https://www.tandfonline.com/doi/pdf/10.1080/17439884.2020.1694945?needAccess=true

Weekly Reflection 8: Selwyn (2010), “Looking beyond Learning: Notes towards the Critical Study of Educational Technology”

This short but excellent article is about the need for a ‘critical’ approach to technology in education research.

Selwyn contends that most of the research in the field of technology in education shares the same basic flaws in approach. It tends to focus only on what is possible with new educational technology, and to examine technological possibilities divorced from the context on the ground. When it does try to diagnose why technology is not being taken up as planned, it tends to blame educators for lacking the aptitudes or competencies needed for full technological uptake. The literature tends to take a Utopian view of technology, to disembed it from the larger political, social, and economic framework which gave rise to it, and to ignore the larger consequences at this same level of analysis. The result is a field of study rife with technological determinism – the propensity to see technology as a disembodied, outside force which cannot be resisted or negotiated but only adapted to – and technological boosterism.

The solution to this situation, according to Selwyn, is the development of a ‘critical’ approach to technology in education research. What this implies is re-embedding technology in the social, political, and economic context which gave birth to it, and examining the social, political, and economic consequences of its implementation. Technology is never neutral. Because it was created in the context of a class society, riven with relations of exploitation and oppression, it is imprinted with these features at the very moment of its birth. Technology therefore has a tendency to act in ways which perpetuate and intensify these relations. It serves certain vested interests, while harming the interests of others. A critical approach to technology of education research would therefore take note of who is served by new technologies in education, and how. Are large tech companies the beneficiaries of the push for technology in the classroom? What are the stakes for ministries of education and individual schools? A critical approach would also examine the larger social, political, and economic consequences of technology implementation in the classroom. I have discussed some of these issues in another reflection. They include the need to investigate such impacts as what the training of students in technology does to the labour market in high-tech occupations, or the way in which technology in education serves the Neoliberal policy agenda. But, Selwyn points out, technology also has emancipatory potential. A critical approach to technology in education research would start from a view of society which includes class analysis and which is motivated by the desire to push forward a social justice agenda. The possibility for new technologies to further social justice and reduce inequality would therefore also be analysed within the critical technology of education framework.

I think this is a brilliant article. It amounts to a call for a reorientation of the field of technology of education research towards undertaking a larger sociology of technology in education. Really, it advocates for embedding technology of education thinking inside a more dialectical, historical-materialist view of society, technology, and social change. Technology is not some autonomous force, disconnected from social and economic relations. That is the type of thinking which leads naïve techno-Utopians to the view that the progressive automation of labour processes will lead to a leisure society within a few short decades. John Maynard Keynes himself fell for such a chimera. The problem with this analysis, of course, is that it forgets the social framework in which technology is created and implemented, which is a class framework. Automation does not lead to a leisure society because the relations of distribution are private and unequal. Automation therefore leads to unemployment and immiseration for many, and the intensification of work for others.

This same type of historical-materialist analysis needs to be undertaken for educational technologies. Selwyn’s article, however, is only an introduction to this approach. It leaves it to others to begin the more empirical, on-the-ground work that is prescribed. How would this research agenda be operationalized in a concrete way? Some of the questions I provided in the previous paragraph could act as starting points for more grounded research. What are the vested interests in the application of new technologies of education? How do large technology companies like Google benefit from it? What purpose does it serve for educational institutions? What are the larger societal impacts of new technologies? Do they facilitate the implementation of a Neoliberal policy agenda? How? These are some of the questions which I believe it would be germane to explore.

Article:

https://onlinelibrary-wiley-com.ezproxy.library.uvic.ca/doi/pdfdirect/10.1111/j.1365-2729.2009.00338.x

Weekly Reflection 7: Oliver (2011), “Technological Determinism in Educational Technology Research: Some Alternative Ways of Thinking about the Relationship between Learning and Technology”

This article makes the case that much of the research in the field of educational technology is tinged with certain common, underlying assumptions about the nature of technology and its role in learning and social change. Many of these assumptions could be placed under the rubric of ‘Technological Determinism.’ In the article, Oliver explains this concept of Technological Determinism, before attempting to identify some approaches in educational technology research that fall under alternative paradigms.

Technological Determinism could be described as a view of technology that sees the latter as an autonomous force, arising on its own, outside of social processes, but which itself shapes social change in a rather direct and causal way. Educational research taking this view as a starting point tends to see technological change in the classroom as a given, inexorable process, that must lead to certain direct adaptations by teachers and learners. It tends to view the technologies themselves as largely neutral and holding no signature of societal structure and power. Technological determinism comes in two varieties. Technological optimists see technology as a prime, autonomous driver of social change, but consider this process to be largely good: negative outcomes from the adoption of technology fall under the category of ‘unintended consequences.’ Technological pessimists, on the other hand, are, as the name implies, more skeptical about the possible benefit of the adoption of new technologies, but still do not go so far as to question the underlying assumption that technological change is inevitable and is the prime mover in determining social change. Technological determinists, of both kinds, tend to assess the benefits or drawbacks of technological change on the basis of only the immediate outcomes in terms of efficiency gains and impacts on learners, but do not consider the larger social processes which give birth to technological change and the way in which the latter impinges on certain pre-existing social structures.

As an antidote to this view, we could do worse than to refer back to the view of social change advanced by Marx, and the role of technology therein. The relevant passage comes from a footnote in Volume 1 of Capital, in a chapter called “Machinery and Large-Scale Industry.” The footnote contains the following sentence, which is of great importance:

Technology reveals the active relation of man [sic] to nature, the direct process of production of his life and thereby it also lays bare the process of the production of the social relations of his life, and of the mental conceptions that flow from those relations.  

(p. 493)

We have various elements at play in the process of social change, therefore: individuals, their mental conceptions, their social relations, and technology. All of these elements are related, and yet there is little direct causal relationship between them. Technology arises out of social relations, but it is not reduced to them (it only ‘reveals’ them). Mental conceptions are also at play, and there is always room for individual agency and action against the backdrop of social structures. As Marx says elsewhere, in the 18th Brumaire of Louis Bonaparte: “man makes his own history, but not under the conditions of his choosing” [my paraphrase].

This view of the dynamics of social change reminds us that technology is never an autonomous, natural force which exists outside of social relations and which is neutral as far as class and power relations are concerned. Social relations themselves give rise to technology and help determine the shape it takes. Because we live in a class society, defined by inequality and exploitation, we ought to be initially skeptical of new technologies and to assume that they will have a tendency to exacerbate and intensify exploitative social relations. With respect to educational technology, we can see this tendency at work in the way the implementation of new classroom technology serves to expand the profits of the tech sector, increase societal dependence on these products, and advance the Neoliberal agenda by providing cover for the reduction of social spending on education. But neither does such a view foreclose the possibility that technology can be used to emancipatory ends. Indeed, Marx’s view of social change also delineates the way in which, in dialectical fashion, the seeds of the future society are present (albeit sometimes hard to make out) in present society. Technology, therefore, does contain the potential to be used to emancipatory ends, and there are countless empirical examples of this. The point is that if social relations give birth to technology, they also retain the power to shape its application. Likewise, human agency can be exercised to decide how much technology will be adopted and how it will be used.

While I have just presented my own rebuttal to technological determinism, Oliver, in the article, scours the field of education and technology to try to find some alternative paradigms. Activity Theory, drawing on the work of Vygotsky, holds that learning is a function of the deliberate interaction between individuals and their social environments, mediated by tools. The Communities of Practice approach defines learning as the development of identity and competency within a social group. Finally, Actor-Network Theory examines how people work in groups, aided by tools, to sustain (or fail to sustain) social processes.

In my view, none of these paradigms is really equipped to answer the questions posed by, or address the shortfalls of, technological determinism. They all see learning as an outcome of interaction with, and integration into, a larger social group. Clearly, in this case, if this larger group is using technology to communicate and socialize, then what educators need to do is give students access to these technologies and improve their abilities to use them. But this view never calls into question how these technologies come to be, what their political content is, and the larger social outcomes they produce. Nor do these approaches see technology as anything but an autonomous, natural, historical development. They do not account for the fact that human societies have agency over whether and how to use technologies. This is especially the case in the classroom setting, where educators have considerable power to shape the social environment. These approaches never ask whether and which technologies to integrate there, and what their other effects will be.

The only other approach suggested by Oliver which seems to offer at least a partial remedy to technological determinism, in my view, is the Social Construction of Technology approach. The latter does indeed begin to look at the larger social forces behind technological development, as well as the social, moral and ethical consequences of adopting certain technologies. This approach, however, is, according to Oliver, underdeveloped in the educational technology research literature.  

Article:

https://onlinelibrary-wiley-com.ezproxy.library.uvic.ca/doi/pdfdirect/10.1111/j.1365-2729.2011.00406.x
