Human Practices

Over the course of the summer we learnt the importance of the relationship between science and society (human practices). This relationship is at its strongest when the potential impacts of new research or innovation are considered from the early stages of a project. This period offers the greatest opportunity to shape and control innovation, with far fewer costs and vested interests in play. However, at this stage there is little or no evidence of how an innovation should be adapted so that it integrates better into society. This step is therefore generally skipped, and whether an innovation leads to undesirable or harmful impacts is left largely to luck (Williams, 1981): under this burden of imperfect foresight, the negative impacts cannot be "reasonably foreseen" (Owen, 2013). This section looks at the failings of the current system and at the framework we adopted to consider these impacts from the earliest stages of our project.

Failings of the Current System

The current system of laws and standards used to regulate innovation is failing to prevent social, environmental and economic harm. Outside of iGEM, human practices is not routinely considered by the wider scientific community, and the impacts and consequences of innovation are only brought into question in the latter stages of the innovation process. At that point they are governed by laws and standards designed to regulate innovations after they have been developed and once there is evidence of harm (Lee, 2012). This form of regulation is poorly equipped to govern novel areas of science and technology, such as synthetic biology, whose current and future impacts are highly uncertain because they have no historical precedent. As a result, knowledge of the social norms surrounding such innovation may be poorly defined, unclear or contested (Owen, 2013).

There are also long delays before we understand the wider impacts, implications and consequences of an innovation (Pacces, 2010). This creates problems in both directions: innovation may be implemented more slowly than it should be, or harm may be inflicted because long-term effects were not investigated before a product was released. The stakes are now higher than ever, since major technological advances in our globalised civilisation are bound to affect the vast majority of the population; unforeseen consequences, or being too slow to implement new technology, could prove fatal.

Another major issue is that by the time we have gained the knowledge needed to properly understand an innovation's impacts, the innovation may already be locked into society, leaving little appetite or power to do anything about it (Collingridge, 1980). Familiar examples include the microplastics in cosmetics and the plastics used in clothing, which long-term studies revealed to be polluting our oceans, and the burning of fossil fuels, which is driving anthropogenic climate change; in both cases we are still struggling to mitigate the harm and replace the technology. In such cases the cost of control becomes too great and vested interests resist change, closing down the options to modulate or reshape the innovation (Stirling, 2007). This is known as the Dilemma of Control, and it is one of the main problems with investigating innovations only retrospectively (Owen, 2013).

Introduction to AREA

In order to make the most of the early stages of innovation, a framework is required to ensure responsible research is considered throughout the project. After a seminar with Dr Sarah Hartley from the University of Exeter’s Business School, we became aware of responsible research and innovation (RRI), which is defined as:

Responsible Research and Innovation is a transparent, interactive process by which societal actors and innovators become mutually responsive to each other with a view to the (ethical) acceptability, sustainability and societal desirability of the innovation process and its marketable products (in order to allow a proper embedding of scientific and technological advances in our society) (Von Schomberg, 2011).

In order to embed this into our work we decided to use the AREA (Anticipate, Reflect, Engage, Act) framework, which was developed by Professor Richard Owen. For a detailed breakdown of the AREA framework, please visit our silver human practices page.

Figure 1 shows the AREA framework, which we implemented continuously throughout the course of our project. This framework mirrors existing approaches within biological engineering, such as the Design, Build, Test, Learn paradigm shown in Figure 2. The parallels between the two made it easy to transfer information obtained through our communications with stakeholders into the design and implementation of the innovation itself. Examples of this can be found on our integrated gold human practices page.

Figure 1: Anticipate, Reflect, Engage, Act framework
Figure 2: Design, Build, Test, Learn framework

References

Collingridge, D., The Social Control of Technology. Francis Pinter Ltd, London (1980)

Lee, R. G., Look at Mother Nature on the Run in the 21st Century: Responsible Research and Innovation. Transnational Environmental Law, 1:1, p. 105-117 (2012)

Owen, R., Responsible Innovation: Managing the Responsible Emergence of Science and Innovation in Society. Wiley, p.27-34 (2013)

Pacces, A., Consequences of uncertainty for regulation: Law and economics of the financial crisis. European Company & Financial Law Review, 7, 479-511 (2010)

Stirling, A., A general framework for analysing diversity in science, technology and society. Journal of the Royal Society Interface, 4, p. 707-719 (2007)

Von Schomberg, R., The quest for the "right" impacts of science and technology. An outlook towards a framework for responsible research and innovation. In Technikfolgen abschätzen lehren: Bildungspotenziale transdisziplinärer Methoden (eds M. Dusseldorp, R. Beecroft). Springer-Verlag, p. 394 (2011)

Williams, B., Moral Luck. Cambridge: Cambridge University Press, p. 20-39 (1981)