<?xml version="1.0" encoding="ISO-8859-1"?><article xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<front>
<journal-meta>
<journal-id>0873-6529</journal-id>
<journal-title><![CDATA[Sociologia, Problemas e Práticas]]></journal-title>
<abbrev-journal-title><![CDATA[Sociologia]]></abbrev-journal-title>
<issn>0873-6529</issn>
<publisher>
<publisher-name><![CDATA[Editora Mundos Sociais]]></publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id>S0873-65292009000300002</article-id>
<title-group>
<article-title xml:lang="en"><![CDATA[Technology, Complexity, and Risk: Social systems analysis of risky socio-technical systems and the likelihood of accidents]]></article-title>
<article-title xml:lang="pt"><![CDATA[Tecnologia, complexidade, e risco: análise social sistémica de sistemas sociotécnicos de risco e da probabilidade de acidentes]]></article-title>
<article-title xml:lang="fr"><![CDATA[Technologie, complexité et risque: analyse sociale systémique des systèmes socio-techniques de risque et de la probabilité d'accidents]]></article-title>
<article-title xml:lang="es"><![CDATA[Tecnología, complejidad, y riesgo: análisis social sistémico de sistemas sociotécnicos de riesgo y de la probabilidad de accidentes]]></article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author">
<name>
<surname><![CDATA[Burns]]></surname>
<given-names><![CDATA[Tom R.]]></given-names>
</name>
<xref ref-type="aff" rid="A01"/>
<xref ref-type="aff" rid="A02"/>
<xref ref-type="aff" rid="A03"/>
</contrib>
<contrib contrib-type="author">
<name>
<surname><![CDATA[Machado]]></surname>
<given-names><![CDATA[Nora]]></given-names>
</name>
<xref ref-type="aff" rid="A04"/>
<xref ref-type="aff" rid="A05"/>
<xref ref-type="aff" rid="A06"/>
</contrib>
</contrib-group>
<aff id="A01">
<institution><![CDATA[Lisbon University Institute]]></institution>
<addr-line><![CDATA[Lisbon ]]></addr-line>
<country>Portugal</country>
</aff>
<aff id="A02">
<institution><![CDATA[Stanford University Woods Institute for the Environment]]></institution>
<addr-line><![CDATA[Stanford]]></addr-line>
<country>USA</country>
</aff>
<aff id="A03">
<institution><![CDATA[University of Uppsala, Department of Sociology, Uppsala Theory Circle]]></institution>
<addr-line><![CDATA[Uppsala ]]></addr-line>
<country>Sweden</country>
</aff>
<aff id="A04">
<institution><![CDATA[Lisbon University Institute, CIES - Centro de Investigação e Estudos de Sociologia]]></institution>
<addr-line><![CDATA[Lisbon ]]></addr-line>
<country>Portugal</country>
</aff>
<aff id="A05">
<institution><![CDATA[Stanford University]]></institution>
<addr-line><![CDATA[Stanford]]></addr-line>
<country>USA</country>
</aff>
<aff id="A06">
<institution><![CDATA[University of Gothenburg, Department of Sociology]]></institution>
<addr-line><![CDATA[Gothenburg ]]></addr-line>
<country>Sweden</country>
</aff>
<pub-date pub-type="pub">
<day>00</day>
<month>12</month>
<year>2009</year>
</pub-date>
<pub-date pub-type="epub">
<day>00</day>
<month>12</month>
<year>2009</year>
</pub-date>
<numero>61</numero>
<fpage>11</fpage>
<lpage>40</lpage>
<copyright-statement/>
<copyright-year/>
<self-uri xlink:href="http://scielo.pt/scielo.php?script=sci_arttext&amp;pid=S0873-65292009000300002&amp;lng=en&amp;nrm=iso"></self-uri><self-uri xlink:href="http://scielo.pt/scielo.php?script=sci_abstract&amp;pid=S0873-65292009000300002&amp;lng=en&amp;nrm=iso"></self-uri><self-uri xlink:href="http://scielo.pt/scielo.php?script=sci_pdf&amp;pid=S0873-65292009000300002&amp;lng=en&amp;nrm=iso"></self-uri><abstract abstract-type="short" xml:lang="en"><p><![CDATA[This article conceptualizes the multi-dimensional "human factor" in risky technology systems and cases of accidents. A social systems theory is applied to the analysis of hazardous technology and socio-technical systems, their complex dynamics, and risky dimensions. The "human factor" is often vaguely identified as a risk factor in hazardous socio-technical systems, particularly when accidents occur. But it is usually viewed more or less as a "black box", under-specified and under-analyzed. Three key aims of the article are: (1) to identify and theorize in a systematic way the multi-dimensional character of the "human factor" in risky systems and accidents; (2) to enable the systematic application of substantial social science knowledge to the regulation of hazardous systems, their managers and operatives as well as regulators, especially relating to the "human factor"; (3) to serve as a guiding tool for researchers and regulators in the collection and organization of data on human and other factors in risky systems and accidents. In sum, the article proposes a systematic approach to analyzing many of the diverse human risk factors associated with complex technologies and socio-technical systems, thus contributing knowledge toward preventing - or minimizing the likelihood of - accidents or catastrophes.]]></p></abstract>
<abstract abstract-type="short" xml:lang="pt"><p><![CDATA[Este artigo conceptualiza a multidimensionalidade do "factor humano" em sistemas tecnológicos de risco e em casos de acidentes. A teoria dos sistemas sociais é aplicável à análise de tecnologia perigosa e de sistemas sociotécnicos, às respectivas dinâmicas complexas e dimensões de risco. O "factor humano" é muitas vezes vagamente identificado como factor de risco em sistemas sociotécnicos de risco, particularmente quando ocorrem acidentes. Mas é usualmente visto mais ou menos como a "caixa-negra", subespecificado e subanalisado. Três objectivos fundamentais do artigo são: (1) identificar e teorizar de forma sistemática o carácter multidimensional do "factor humano" em sistemas de risco e em acidentes; (2) permitir a aplicação sistemática de conhecimento substancial da ciência social à regulação de sistemas perigosos, seus gestores e operadores bem como reguladores, especialmente relacionados com o "factor humano"; (3) servir como ferramenta de orientação para investigadores e reguladores na compilação e organização de dados sobre humanos e outros factores em sistemas de risco e acidentes. Em suma, o artigo propõe uma abordagem sistemática para analisar muitos dos diversos factores humanos de risco associados a tecnologias complexas e a sistemas sociotécnicos, contribuindo, assim, para o conhecimento preventivo - ou para minimizar a probabilidade - de acidentes ou catástrofes.]]></p></abstract>
<abstract abstract-type="short" xml:lang="fr"><p><![CDATA[Le présent article conceptualise la multidimensionnalité du &#8220;facteur humain&#8221; dans les systèmes technologiques de risque et en cas d'accidents. La théorie des systèmes sociaux est applicable à l'analyse de la technologie dangereuse et des systèmes socio-techniques, à leurs dynamiques complexes et à leurs dimensions de risque. Le &#8220;facteur humain&#8221; est souvent vaguement identifié comme un facteur de risque dans des systèmes socio-techniques de risque, notamment en cas d'accidents. Mais il est habituellement plus ou moins considéré comme la " boîte noire ", sous-spécifié et sous-analysé. Cet article a trois objectifs fondamentaux : (1) identifier et théoriser systématiquement le caractère multidimensionnel du " facteur humain " dans les systèmes de risque et les accidents ; (2) permettre l'application systématique de la connaissance substantielle de la science sociale à la régulation des systèmes dangereux, à leurs gestionnaires, leurs opérateurs et aussi leurs régulateurs, ayant un lien particulier avec le " facteur humain " ; (3) servir comme outil d'orientation aux chercheurs et aux régulateurs dans la compilation et l'organisation des données sur les humains et autres facteurs dans des systèmes de risque et les accidents. En somme, l'article propose une approche systématique pour analyser la plupart des facteurs humains de risque associés aux technologies complexes et aux systèmes socio-techniques, contribuant ainsi à la connaissance préventive des accidents ou des catastrophes, ou à en minimiser la probabilité.]]></p></abstract>
<abstract abstract-type="short" xml:lang="es"><p><![CDATA[Este artículo conceptualiza la multidimensionalidad del "factor humano" en los sistemas tecnológicos de riesgo y en casos de accidentes. La teoría de los sistemas sociales es aplicable al análisis de la tecnología peligrosa y de los sistemas sociotécnicos, de las respectivas dinámicas complejas y dimensiones de riesgo. El "factor humano" es muchas veces vagamente identificado como factor de riesgo en sistemas sociotécnicos de riesgo, particularmente cuando ocurren accidentes. Sin embargo, es usualmente visto más o menos como la "caja-negra", subespecificado y subanalizado. Son tres los objetivos fundamentales de este artículo: (1) identificar y teorizar de forma sistemática el carácter multidimensional del "factor humano" en sistemas de riesgo y en accidentes; (2) permitir la aplicación sistemática del conocimiento substancial de las ciencias sociales a la regulación de sistemas peligrosos, sus gestores y operadores así como reguladores, especialmente relacionados con el "factor humano"; (3) servir como herramienta de orientación para investigadores y reguladores en la compilación y organización de datos sobre humanos y otros factores en sistemas de riesgo y accidentes. En suma, el artículo propone un abordaje sistemático para analizar muchos de los diversos factores humanos de riesgo asociados a las tecnologías complejas y a sistemas sociotécnicos, contribuyendo, así, al conocimiento preventivo -- o para minimizar la probabilidad -- de accidentes o catástrofes.]]></p></abstract>
<kwd-group>
<kwd lng="en"><![CDATA[actor-system dynamics]]></kwd>
<kwd lng="en"><![CDATA[technology]]></kwd>
<kwd lng="en"><![CDATA[socio-technical system]]></kwd>
<kwd lng="en"><![CDATA[complexity]]></kwd>
<kwd lng="en"><![CDATA[risk]]></kwd>
<kwd lng="en"><![CDATA[risky system]]></kwd>
<kwd lng="en"><![CDATA[accident]]></kwd>
<kwd lng="en"><![CDATA[regulation and control]]></kwd>
<kwd lng="pt"><![CDATA[dinâmica actor-sistema]]></kwd>
<kwd lng="pt"><![CDATA[tecnologia]]></kwd>
<kwd lng="pt"><![CDATA[sistema sociotécnico]]></kwd>
<kwd lng="pt"><![CDATA[complexidade]]></kwd>
<kwd lng="pt"><![CDATA[risco]]></kwd>
<kwd lng="pt"><![CDATA[sistema de risco]]></kwd>
<kwd lng="pt"><![CDATA[acidente]]></kwd>
<kwd lng="pt"><![CDATA[regulação e controle]]></kwd>
<kwd lng="fr"><![CDATA[dynamique acteur-système]]></kwd>
<kwd lng="fr"><![CDATA[technologie]]></kwd>
<kwd lng="fr"><![CDATA[système socio-technique]]></kwd>
<kwd lng="fr"><![CDATA[complexité]]></kwd>
<kwd lng="fr"><![CDATA[risque]]></kwd>
<kwd lng="fr"><![CDATA[système de risque]]></kwd>
<kwd lng="fr"><![CDATA[accident]]></kwd>
<kwd lng="fr"><![CDATA[régulation et contrôle]]></kwd>
<kwd lng="es"><![CDATA[dinámica actor-sistema]]></kwd>
<kwd lng="es"><![CDATA[tecnología]]></kwd>
<kwd lng="es"><![CDATA[sistema sociotécnico]]></kwd>
<kwd lng="es"><![CDATA[complejidad]]></kwd>
<kwd lng="es"><![CDATA[riesgo]]></kwd>
<kwd lng="es"><![CDATA[sistema de riesgo]]></kwd>
<kwd lng="es"><![CDATA[accidente]]></kwd>
<kwd lng="es"><![CDATA[regulación y control]]></kwd>
</kwd-group>
</article-meta>
</front><body><![CDATA[ <p> <B>Technology, Complexity, and Risk</B></P> <p> <B>Social systems analysis of risky socio-technical systems and&nbsp;the&nbsp;likelihood of accidents</B> </P> <p> Tom R. Burns<a href="#a1">*</a> <a name="topa1"></a>and Nora Machado<a href="#a2">**</a> <a name="topa2"></a> </P> <p>&nbsp;</P> <p><b>Abstract</b></P> <p>This article conceptualizes the multi-dimensional &quot;human factor&quot; in risky technology systems and cases of accidents. A social systems theory is applied to the analysis of hazardous technology and socio-technical systems, their complex dynamics, and risky dimensions. The &quot;human factor&quot; is often vaguely identified as a risk factor in hazardous socio-technical systems, particularly when accidents occur. But it is usually viewed more or less as a &quot;black box&quot;, under-specified and under-analyzed. Three key aims of the article are: (1) to identify and theorize in a systematic way the multi-dimensional character of the &quot;human factor&quot; in risky systems and accidents; (2) to enable the systematic application of substantial social science knowledge to the regulation of hazardous systems, their managers and operatives as well as regulators, especially relating to the &quot;human factor&quot;; (3) to serve as a guiding tool for researchers and regulators in the collection and organization of data on human and other factors in risky systems and accidents. In sum, the article proposes a systematic approach to analyzing many of the diverse human risk factors associated with complex technologies and socio-technical systems, thus contributing knowledge toward preventing - or minimizing the likelihood of - accidents or catastrophes. </P> <p> <U>Key-words</U> actor-system dynamics, technology, socio-technical system, complexity, risk, risky system, accident, regulation and control. 
</P>     <p>&nbsp;</P>      <p><B>Resumo</B></P>     <p> <b>Tecnologia, complexidade, e risco: an&#225;lise social sist&#233;mica de&nbsp;sistemas    sociot&#233;cnicos de risco e da probabilidade de acidentes </b></P>     ]]></body>
<body><![CDATA[<p> Este artigo conceptualiza a multidimensionalidade do &quot;factor humano&quot; em sistemas tecnol&#243;gicos de risco e em casos de acidentes. A teoria dos sistemas sociais &#233; aplic&#225;vel &#224; an&#225;lise de tecnologia perigosa e de sistemas sociot&#233;cnicos, &#224;s respectivas din&#226;micas complexas e dimens&#245;es de risco. O &quot;factor humano&quot; &#233; muitas vezes vagamente identificado como factor de risco em sistemas sociot&#233;cnicos de risco, particularmente quando ocorrem acidentes. Mas &#233; usualmente visto mais ou menos como a &quot;caixa-negra&quot;, subespecificado e subanalisado. Tr&#234;s objectivos fundamentais do artigo s&#227;o: (1) identificar e teorizar de forma sistem&#225;tica o car&#225;cter multidimensional do &quot;factor humano&quot; em sistemas de risco e em acidentes; (2) permitir a aplica&#231;&#227;o sistem&#225;tica de conhecimento substancial da ci&#234;ncia social &#224; regula&#231;&#227;o de sistemas perigosos, seus gestores e operadores bem como reguladores, especialmente relacionados com o &quot;factor humano&quot;; (3) servir como ferramenta de orienta&#231;&#227;o para investigadores e reguladores na compila&#231;&#227;o e organiza&#231;&#227;o de dados sobre humanos e outros factores em sistemas de risco e acidentes. Em suma, o artigo prop&#245;e uma abordagem sistem&#225;tica para analisar muitos dos diversos factores humanos de risco associados a tecnologias complexas e a sistemas sociot&#233;cnicos, contribuindo, assim, para o conhecimento preventivo &#151; ou para minimizar a probabilidade &#151; de acidentes ou cat&#225;strofes. </P> <P> <U>Palavras-chave</U> &nbsp;&nbsp;&nbsp;din&#226;mica actor-sistema, tecnologia, sistema sociot&#233;cnico, complexidade, risco, sistema de risco, acidente, regula&#231;&#227;o e controle. 
</P> <p>&nbsp;</P> <p><B>R&#233;sum&#233;</B> </P> <p> <b>Technologie, complexit&#233; et risque: analyse sociale syst&#233;mique des syst&#232;mes socio-techniques de risque et de la probabilit&#233; d'accidents </b> </P> <p> Le pr&#233;sent article conceptualise la multidimensionnalit&#233; du &#147;facteur humain&#148; dans les syst&#232;mes technologiques de risque et en cas d'accidents. La th&#233;orie des syst&#232;mes sociaux est applicable &#224; l'analyse de la technologie dangereuse et des syst&#232;mes socio-techniques, &#224; leurs dynamiques complexes et &#224; leurs dimensions de risque. Le &#147;facteur humain&#148; est souvent vaguement identifi&#233; comme un facteur de risque dans des syst&#232;mes socio-techniques de risque, notamment en cas d'accidents. Mais il est habituellement plus ou moins consid&#233;r&#233; comme la &quot;bo&#238;te noire&quot;, sous-sp&#233;cifi&#233; et sous-analys&#233;. Cet article a trois objectifs fondamentaux : (1) identifier et th&#233;oriser syst&#233;matiquement le caract&#232;re multidimensionnel du &quot;facteur humain&quot; dans les syst&#232;mes de risque et les accidents ; (2) permettre l'application syst&#233;matique de la connaissance substantielle de la science sociale &#224; la r&#233;gulation des syst&#232;mes dangereux, &#224; leurs gestionnaires, leurs op&#233;rateurs et aussi leurs r&#233;gulateurs, ayant un lien particulier avec le &quot;facteur humain&quot; ; (3) servir comme outil d'orientation aux chercheurs et aux r&#233;gulateurs dans la compilation et l'organisation des donn&#233;es sur les humains et autres facteurs dans des syst&#232;mes de risque et les accidents. 
En somme, l'article propose une approche syst&#233;matique pour analyser la plupart des facteurs humains de risque associ&#233;s aux technologies complexes et aux syst&#232;mes socio-techniques, contribuant ainsi &#224; la connaissance pr&#233;ventive des accidents ou des catastrophes, ou &#224; en minimiser la probabilit&#233;. </P> <P> <U>Mots-cl&#233;s</U> &nbsp;&nbsp;dynamique acteur-syst&#232;me, technologie, syst&#232;me socio-technique, complexit&#233;, risque, syst&#232;me de risque, accident, r&#233;gulation et contr&#244;le. </P> <P>&nbsp;</P> <P><B>Resumen</B> </P> <p> <b>Tecnolog&#237;a, complejidad, y riesgo: an&#225;lisis social sist&#233;mico de&nbsp;sistemas sociot&#233;cnicos de riesgo y de la probabilidad de accidentes </b></P> ]]></body>
<body><![CDATA[<p> Este art&#237;culo conceptualiza la multidimensionalidad del &quot;factor humano&quot; en los sistemas tecnol&#243;gicos de riesgo y en casos de accidentes. La teor&#237;a de los sistemas sociales es aplicable al an&#225;lisis de la tecnolog&#237;a peligrosa y de los sistemas sociot&#233;cnicos, de las respectivas din&#225;micas complejas y dimensiones de riesgo. El &quot;factor humano&quot; es muchas veces vagamente identificado como factor de riesgo en sistemas sociot&#233;cnicos de riesgo, particularmente cuando ocurren accidentes. Sin embargo, es usualmente visto m&#225;s o menos como la &quot;caja-negra&quot;, subespecificado y subanalizado. Son tres los objetivos fundamentales de este art&#237;culo: (1) identificar y teorizar de forma sistem&#225;tica el car&#225;cter multidimensional del &quot;factor humano&quot; en sistemas de riesgo y en accidentes; (2) permitir la aplicaci&#243;n sistem&#225;tica del conocimiento substancial de las ciencias sociales a la regulaci&#243;n de sistemas peligrosos, sus gestores y operadores as&#237; como reguladores, especialmente relacionados con el &quot;factor humano&quot;; (3) servir como herramienta de orientaci&#243;n para investigadores y reguladores en la compilaci&#243;n y organizaci&#243;n de datos sobre humanos y otros factores en sistemas de riesgo y accidentes. En suma, el art&#237;culo propone un abordaje sistem&#225;tico para analizar muchos de los diversos factores humanos de riesgo asociados a las tecnolog&#237;as complejas y a sistemas sociot&#233;cnicos, contribuyendo, as&#237;, al conocimiento preventivo - o para minimizar la probabilidad - de accidentes o cat&#225;strofes. </P> <p> <U>Palabras-clave</U> &nbsp;&nbsp;&nbsp;din&#225;mica actor-sistema, tecnolog&#237;a, sistema sociot&#233;cnico, complejidad, riesgo, sistema de riesgo, accidente, regulaci&#243;n y control. 
</P> <p>&nbsp;</P> <p>&nbsp;</P> <p> <B>Introduction</B> </P> <p> The paper<A HREF="#1"><SUP>1</SUP></A><a name="top1"></a> introduces and applies actor-system dynamics (ASD), a general systems theory, to the analysis of the risks and accidents of complex, hazardous technologies and socio-technical systems. Section 1 introduces ASD theory. Section 2 applies the theory to the analysis of hazardous technologies and socio-technical systems, exposing cognitive and control limitations in relation to such constructions (Burns and Deville, 2003; Machado, 1990, 1998). The paper emphasizes the importance of investigating and theorizing the particular ways in which institutional as well as individual factors increase or decrease the potential risks and the incidence of accidents. </P> <p>&nbsp; </P> <p> <B>Actor-system dynamics theory in a nutshell</B> </P> <p> <I>Introduction</I> </P> <p> Actor-system dynamics (ASD) emerged in the 1970s out of early social systems analysis (Baumgartner, Burns and DeVille, 1986; Buckley, 1967; Burns, 2006a, 2006b; Burns, Baumgartner and DeVille, 1985; Burns and others, 2002). Social relations, groups, organizations, and societies were conceptualized as sets of inter-related parts with internal structures and processes. A key feature of the theory was its consideration of social systems as open to, and interacting with, their social and physical environments. Through interaction with their environment &#151; as well as through internal processes &#151; such systems acquire new properties and are transformed, resulting in emergent properties and evolutionary developments. Another major characteristic of the theory has entailed a conceptualization of human agents as creative (as well as destructive) transformative forces. It has also been axiomatic from the outset that human agents are moral agents, shaping, reshaping, and implementing normative and other moral rules. 
They have intentionality; they are self-reflective and consciously self-organizing beings. They may choose, however, to deviate, oppose, or act in innovative and even perverse ways relative to the norms, values, and social structures of the particular social systems within which they act and interact. </P> ]]></body>
<body><![CDATA[<P> Human agents, as cultural beings, are constituted and constrained by social rules and complexes of such rules (Burns and Flam, 1987). These provide the major basis on which people organize and regulate their interactions, interpret and predict their activities, and develop and articulate accounts and critical discourses of their affairs. Social rule systems are key constraining and enabling conditions for, as well as the products of, social interaction (the duality principle). </P> <p> The construction of ASD has entailed a number of key innovations: (1) the conceptualization of human agents as creative (also destructive), self-reflective, and self-transforming beings; (2) cultural and institutional formations constituting the major environment of human behavior, an environment in part internalized in social groups and organizations in the form of shared rules and systems of rules; (3) interaction processes and games as embedded in cultural and institutional systems which constrain, facilitate, and, in general, influence action and interaction of human agents; (4) a conceptualization of human consciousness in terms of self-representation and self-reflectivity on collective and individual levels; (5) social systems as open to, and interacting with, their environment; through interaction with their environment and through internal processes, such systems acquire new properties, and are transformed, resulting in their evolution and development; (6) social systems as configurations of tensions and dissonance because of contradictions in institutional arrangements and cultural formations and related struggles among groups; and (7) the evolution of rule systems as a function of (a) human agency realized through interactions and games and (b) selective mechanisms which are, in part, constructed by social agents in forming and reforming institutions and also, in part, a function of 
physical and ecological environments. </P> <p>&nbsp;</P> <p><i>General framework</i></P> <p> This section identifies a minimum set of concepts essential to description and model-building in social system analysis (see figure 1 below; the following roman numerals are indicated in figure 1). </P> <p> (I)&nbsp;&nbsp;&nbsp;&nbsp;The diverse constraints and facilitators of the actions and interactions of human agents, in particular: (IA) Social structures (institutions and cultural formations based on socially shared rule systems) which structure and regulate agents and their interactions, determining constraints as well as facilitating opportunities for initiative and transformation. (IB) Physical structures which constrain as well as sustain human activities, providing, for instance, resources necessary for life and material development. Included here are physical and ecological factors (waters, land, forests, deserts, minerals, other resources). (IA, IB) Socio-technical systems combine material and social structural elements. (IA-S) and (IB-S) in figure 1 are, respectively, key social and material (or &#147;natural&#148;) structuring and selection mechanisms that operate to constrain and facilitate agents&#146; activities and their consequences; these mechanisms also allocate resources, in some cases generating sufficient &#147;payoffs&#148; (quantity, quality, diversity) to reproduce or sustain social agents and their structures; in other cases not. </P> <P> (II)&nbsp;&nbsp;&nbsp;&nbsp;Population(s) of interacting social agents, occupying positions and playing different roles <I>vis-a-vis</I> one another in the context of their socio-structural, socio-technical, and material systems. 
Individual and collective agents are constituted and regulated through such social structures as institutions; at the same time, they are not simply robots performing programs or implementing rules but are adapting, filling in particulars, and innovating. </P> <p> (III)&nbsp;&nbsp;&nbsp;&nbsp;Social action and interaction (or game) processes that are structured and regulated through established material and social conditions.<A HREF="#2"><SUP>2</SUP></A> <a name="top2"></a>Social actors (individuals and collectives) together with interaction processes make up human agency. </P> <p> (IV)&nbsp;&nbsp;&nbsp;&nbsp;Interactions result in multiple consequences and developments, intended and unintended: productions, goods, wastes, and damages as well as impacts on the very social and material structures that constrain and facilitate action and interaction. That is, the actions IVA and IVB operate on the structures IA and IB, respectively. Through their interactions, social agents reproduce, elaborate, and transform social structures (for instance, institutional arrangements and cultural formations based on rule systems) as well as material and ecological conditions. </P> <p> In general, while human agents &#151; individuals as well as organized groups, organizations and nations &#151; are subject to institutional and cultural as well as material constraints on their actions and interactions, they are at the same time active, possibly radically creative/destructive forces, shaping and reshaping cultural formations and institutions as well as their material circumstances. In the process of strategic structuring, agents interact, struggle, form alliances, exercise power, negotiate, and cooperate within the constraints and opportunities of existing structures. 
They change,  intentionally and unintentionally &#151; even through mistakes and performance  failures &#151; the conditions of their own activities and transactions, namely  the physical and social systems structuring and influencing their interactions.  The results entail institutional, cultural, and material developments but  not always as the agents have decided or intended. </P>     ]]></body>
<body><![CDATA[<p> This model conceptualizes three different types of causal drivers, that is, factors that have the capacity to bring about, neutralize, or block change (that is, to change or maintain conditions or states of the social as well as natural worlds). This multi-causal approach consists of causal configurations or powers that affect the processes and outcomes of human activities and developments (Burns and Dietz, 1992a). Three causal forces are of particular importance and make up the &#147;iron triangle&#148; of human agency, social structure, and environment. In particular: </P> <P> (1)&nbsp;&nbsp;&nbsp;&nbsp;The human agency causal matrix. Actors operate purposively to affect their conditions; through their actions, they also have unanticipated and unintended impacts. As indicated in the diagram, actions and outcomes are diverse (see III-IV in figure 1). Actors direct and influence one another, for instance through affecting one another&#146;s cognitive and normative orientations. Agential causality can operate on process levels (that is, within an institutional frame), as when those in positions of authority and power can influence others or make particular collective decisions within given norms and other constraints (see III in figure 1).<A HREF="#3"><SUP>3</SUP></A> <a name="top3" id="top3"></a></P> <p> (2)&nbsp;&nbsp;&nbsp;&nbsp;Social structures (norms, values, and institutions) also generate a type of causal force (IA-S). They pattern and regulate social actions and interactions and their consequences; however, ASD theory recognizes, as stressed earlier, that human agents may, under some conditions, ignore or redirect these arrangements, thereby neutralizing or transforming the causal forces of institutions and cultural formations. Our emphasis here is on &#147;internal&#148; agents and social structures. 
Of course, &#147;external&#148; agents and institutions typically impact on activities and developments within any given social system. But these are special cases of factors (1) and (2) referred to above. </P> <p> (3)&nbsp;&nbsp;&nbsp;&nbsp;The natural and ecological causal complex is the third type of causal force (IB-S). Purely environmental or &#147;natural&#148; forces operate by &#147;selecting&#148; and structuring (constraining/facilitating) human actions and interactions &#151; at the same time that human agents have, to a greater or lesser extent, impacts, in some cases massive impacts, on the physical environments on which humanity and other species depend for survival, as suggested in the model. </P> <p>&nbsp;</P> <p><i>Technology and socio-technical systems in the ASD framework</i></P> <p> Technology, as a particular type of human construction, is defined in ASD as a complex of physical artifacts along with the social rules employed by social actors to understand, utilize and manage the artifacts. Thus, technology has both material and cultural-institutional aspects. Some of the rules considered are the &#147;instruction set&#148; for the technology, the rules that guide its effective operation and management. These rules have a &#147;hands on&#148;, immediate practical character and can be distinguished from other rule systems such as the culture and institutional arrangements of the socio-technical system in which the technology is embedded. The socio-technical system encompasses laws and normative principles as well as other rules, specifying the legitimate or acceptable uses of the technology, the appropriate or legitimate owners and operators, the places and times of its use, the ways the gains and burdens (and risks) of applying the technology should be distributed, and so on. 
The distinction between the specific instruction set and the rules of the broader socio-technical system is not rigid, but the distinction is useful for many analytical purposes. A socio-technical system thus includes the social organization (and, more generally, institutional arrangements) of those who manage, produce, and distribute its &#147;products&#148; and &#147;services&#148; to consumers and citizens as well as those (regulators, managers, and operatives) who deal with the hazards of its use and its social, health, and environmental impacts. </P>     <P> Such socio-technical systems as, for example, a factory, a nuclear power plant, an air transport or electricity system, an organ transplantation system (Machado, 1998), a money system (Burns and DeVille, 2003), or a telecommunication network consist of, on the one hand, complex technical and physical structures that are designed to produce, process, or transform certain things (or to enable such production) and, on the other hand, institutions, norms, and social organizing principles designed to regulate the activities of the actors who operate and manage the technology. The diverse technical and physical structures making up parts of a socio-technical system may be owned and managed by different agents. The knowledge of these different structures, including technical knowledge, is typically dispersed among agents in diverse professions. Thus, a variety of groups, social networks, and organizations may be involved in the design, construction, operation, and maintenance of complex socio-technical systems. The diverse agents involved in operating and managing a given socio-technical system require some degree of coordination and communication. Barriers or distortions in these linkages make mal-performances or system failures likely. 
Thus, the &#147;human factor&#148; explaining mis-performance or breakdown in a socio-technical system often has to do with organizational and communicative features that are difficult to analyze and understand (Burns and Dietz, 1992b; Burns and others, 2002; Vaughn, 1999). </P>     <P>&nbsp;</P>     <P><img src="/img/revistas/spp/n61/61a02f1.jpg" width="510" height="460"></P>     
]]></body>
<body><![CDATA[<P><b>Figure 1</b> General ASD model: the structuring powers and socio-cultural and material embeddedness of interacting human agents </P>     <P>&nbsp;</P>     <p> Technologies are then more than bits of disembodied hardware; they function within social structures where their usefulness and effectiveness are dependent upon organizational structures, management skills, and the operation of incentive and collective knowledge systems (Baumgartner and Burns, 1984; Rosenberg, 1982: 247-8) &#151; hence the importance in our work of the concept of the socio-technical system. The application and effective use of any technology requires a shared cognitive and judgment model or paradigm (Burns and others, 2002; Carson and others, 2009). This model includes principles specifying the mechanisms that are understood to enable the technology to work, as well as its interactions with its physical, biological, and socio-cultural environments. Included here are formal laws of science as well as many ad-hoc &#147;rules of thumb&#148; that are incorporated into technology design and use. </P>     <p> The concept of a socio-technical system implies particular institutional arrangements as well as culture. Knowledge of technology-in-operation presupposes knowledge of social organization (in particular, knowledge of the organizing principles and institutional rules &#151; whether public authority, bureaucracy, private property, contract law, regulative regime, professional skills and competencies, etc. (Machado, 1998)). Arguably, a developed systems approach can deal with this complexity in an informed and systematic way. The model of a socio-technical system should always include a specification and modeling not only of its technology and technical infrastructure but of its social organization, the roles and practices of its managers, operatives, and regulators, and the impacts of the operating system on the larger society and the natural environment. 
</P>     <p> In the following sections, we apply ASD systems theory to the analysis of hazardous technologies and socio-technical systems with some likelihood of leading to accidents &#151; that is, risky systems &#151; and their more effective management and regulation. </P>     <p>&nbsp;</p>    <P> <B>Conceptualizing risky technologies and socio-technical systems</B> </P>     <p> <I>Risky innovations and risky systems</I> </P>     <p> Risky technologies and socio-technical systems are those which have the potential (a certain, even if very low, likelihood) of causing great harm to those involved, possibly partners or clients, third parties, other species, and the environment. Some risky systems have catastrophic potential in that they are capable, in case of a performance or regulatory failure, of killing hundreds or thousands, wiping out species, or irreversibly contaminating the atmosphere, water, or land. </P>     <p> There are a number of potentially hazardous systems which are designed and operated to be low-risk systems, for instance air traffic control systems. When successful, they are characterized by a capacity to provide a high quality of service with a minimum likelihood of significant failures that would risk damage to life and property (LaPorte, 1978, 1984; LaPorte and Consolini, 1991). However, they are often costly to operate. The point is that humans construct many hazardous systems (see later) that have the potential to cause considerable harm to those involved, third parties, or the natural environment. The key to dealing with these risks is &#147;risk manageability&#148; &#151; the extent to which hazards can be managed and effectively regulated. </P>     ]]></body>
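<body><![CDATA[<p> The point above &#151; that even a very low likelihood of failure can make a system risky when its potential harm is catastrophic &#151; can be illustrated with a toy expected-harm calculation. This sketch is our own illustration, not part of the ASD framework, and all of the numbers in it are hypothetical. </p>

```python
# Toy illustration (hypothetical numbers): expected harm per operating period
# is likelihood of failure times the magnitude of harm if failure occurs.
# A rarely failing system with catastrophic potential can carry more risk
# than a frequently failing but mild one.

def expected_harm(p_failure, harm_magnitude):
    """Expected harm per operating period: probability times magnitude."""
    return p_failure * harm_magnitude

routine = expected_harm(0.10, 1_000)              # frequent, minor failures
catastrophic = expected_harm(0.0001, 10_000_000)  # very rare, catastrophic failure

print(routine < catastrophic)  # the "low-likelihood" system is the riskier one
```

<p> On this simple view, &#147;risk manageability&#148; amounts to reducing either factor: lowering the likelihood of failure or containing the magnitude of harm when failure occurs. </p> ]]></body>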
<body><![CDATA[<p> Some technologies and socio-technical systems are much more risky than others; e.g., systems of rapid innovation and development (Machado, 1990; Machado and Burns, 2001) entail unknown hazards, or hazards whose likelihood is also unknown. This has to do not only with the particular hazards they entail or generate, but with the level of knowledge about them and the capacity as well as commitment to control the systems. A hierarchical society with a powerful elite may have a vision or model which it imposes, ignoring or downplaying key values and considerations of particular weak groups or even overall sustainability. In other words, its projects and developments generate risks for weak and marginal groups, and possibly even for the sustainability of the society itself over the long run. Typically, this may be combined with suppression of open discussion and criticism of projects and their goals. Even a highly egalitarian society may generate major risks, for instance, when agents in the society are driven to compete in ways which dispose them to initiate projects and transformations that are risky to the physical and social environment. In this sense, particular institutional arrangements such as those of modern capitalism<A HREF="#4"><SUP>4</SUP></A> <a name="top4"></a>effectively drive competitiveness and high innovation levels (Burns, 2006a). For instance, in the chemical sector, new products and production processes tend to be generated that, without adequate regulation, are risky for, among others, workers, consumers, the environment, and long-term system sustainability. Risky systems arise also from the fact that institutional arrangements and professional groups are inevitably biased in terms of the values they institutionalize and realize through their operations. 
They entail definitions of reality and social controls that may    block or prevent recognizing and dealing with many major types of risks from    technologies and technological developments (although &#147;risk analysis&#148;    and &#147;risk management&#148; are very much at the forefront of their discourses).  </P>     <P> Many contemporary developments are characterized by seriously limited or constrained    scientific models and understandings of what is going on and what is likely    to occur. At the same time many of these developments are revolutionizing human    conditions, and we are increasingly witnessing new discourses about bounded    rationality and risky systems. For instance, areas of the life sciences and    medicine are inspired by the modernist claims to ultimate knowledge and capacity    to control human conditions (Kerr and Cunningham-Burley, 2000; Machado and Burns,    2001).<A HREF="#5"><SUP>5</SUP></A> <a name="top5"></a>Consider several such    developments in the area of biomedicine that have led to unintended consequences    and new legal and ethical challenges of regulation. All of them have been launched    with much promise but they have entailed complex ramifications and the emergence    of major issues and problems not initially recognized or considered. </P>     <p> (1)&nbsp;&nbsp;&nbsp;&nbsp;Life support technologies &#151; life support entails a whole complex of  technologies, techniques, and procedures organized, for instance, in intensive  care units (ICUs). Initially, they were perceived as only a source of good  &#151; saving lives. Over time, however, they confronted hospitals, the medical  profession, the public, and politicians with a wide variety of new problems  and risks. The development has generated a variety of problematic (and  largely unanticipated) conditions. The increasing power of these technologies  has made death more and more into a construction, a &#147;deed. 
&#148; The cultural implications of withholding and withdrawing treatment (&#147;passive euthanasia&#148;), other forms of euthanasia, and increasingly &#147;assisted suicide&#148;, etc. have led to diverse ethical dilemmas and moral risks and are likely to have significant (but for the moment unknown) consequences for human conceptions of and attitudes toward death (Machado, 2005, 2009). </P>     <p> (2)&nbsp;&nbsp;&nbsp;&nbsp;The New Genetics &#151; the new genetics (Kerr and Cunningham-Burley, 2000; Machado and Burns, 2001), as applied to human health problems, involves an alliance of the biotechnology industry, scientists and clinicians from an array of disciplinary backgrounds, and policy-makers and politicians concerned with health care improvement as well as cost reductions. Genetic tests providing risk estimates to individuals are combined with expert counseling so that those at health risk can plan their life choices more effectively. Also, the supply of information about, and control over, their or their offspring&#146;s genetic makeup is heralded as a new biomedical route not only to health improvement but to liberation from many biological constraints. However, its development and applications are likely to lead to a number of dramatic changes, many not yet knowable at this point in time:<A HREF="#6"><SUP>6</SUP></A><a name="top6"></a> thus, new conceptions, dilemmas, and risks relating to health and illness are emerging. And there are increasing problems (and new types of problem) of confidentiality, access to information, and protection of the integrity of individuals. 
Genetic testing offers the potential for widespread surveillance of the population&#146;s health by employers, insurance companies and the state (via health care institutions) and further medicalisation of risk (Kerr and Cunningham-Burley, 2000: 284).<A HREF="#7"><SUP>7</SUP></A> <a name="top7"></a>Finally, there are major risks of &#145;backdoor eugenics&#146; and reinforcement of social biologism as a perspective on human beings (Machado, 2007).<A HREF="#8"><SUP>8</SUP></A> <a name="top8"></a> </P>     <P> (3)&nbsp;&nbsp;&nbsp;&nbsp;Xenotransplantation &#151; xenotransplantation (the transplantation of organs and tissues from one species, for instance pigs, to another, in this case humans) began to develop in the late 1980s as a possible substitute for organ replacement from human donors, with the purpose of creating an unlimited supply of cells and organs for transplantation (Hammer, 2001). According to some observers, there are many uncertainties and risks not just for the patient but also for the larger community. The interspecies transmission of infectious agents via xenografts has the potential to introduce unusual or new agents, including endogenous retroviruses, into the wider human community. Given the ethical issues involved in xenotransplantation for, among others, the &#147;donor&#148; animals, and the potential of provoking animal rights movements, the risks are not negligible. 
The potential of provoking animal rights movements (as in England) may reinforce a hostile social climate that spills over and affects other areas, not just concerning animal welfare but also biotechnology and the important use of animals in bio-medical testing.<A HREF="#9"><SUP>9</SUP></A> <a name="top9"></a></P>     <P> (4)&nbsp;&nbsp;&nbsp;&nbsp;Globalized industrial food production &#151; today, an increasing proportion of the fruits, vegetables, fish, and meats consumed in highly developed countries is grown and processed in less technologically developed countries. The procedures to process food (e.g., pasteurization, cooking, canning) normally ensure safe products. However, these processing procedures may fail in some less developed contexts. For instance, increased outbreaks of some infectious diseases are associated with animal herds (pigs, cattle, chickens). An important factor in these outbreaks is the increasing industrialization of animal-food production in many areas of the world, which has propelled the creation of large-scale animal farms keeping substantial numbers of, for example, pigs or chickens in highly confined spaces. These conditions are commonly associated with a number of infectious outbreaks and diseases in the animal population, many of them a threat to human populations. Not surprisingly, this also explains in part the widespread use of antibiotics to avoid infections and to stimulate growth in these animal populations (increasing, however, the risk of antibiotic-resistant infections in animals and humans) (Editorial, 2000). The existing nationally or regionally based food security and health care infrastructures are having increasing difficulty in effectively handling these problems. Earlier, people were infected by food and drink that was locally produced and locally consumed &#151; and infections were less likely to spread widely. 
</P>     <p> (5)&nbsp;&nbsp;&nbsp;&nbsp;Creation of many large-scale, complex systems &#151; in general, we can model and understand only to a limited extent systems such as nuclear-power plants or global, industrial agriculture,<A HREF="#10"><SUP>10</SUP></A> <a name="top10"></a>global money and financial systems, etc. As a result, there are likely to be many unexpected (and unintended) developments. What theoretical models should be developed and applied to conceptualize and analyze such systems? What restructuring, if any, should be imposed on these developments? How? By whom? As complex systems are developed, new &#147;hazards&#148; are produced which must be investigated, modeled, and controlled. At the same time, conceptions of risk, risk assessment, and risk deliberation evolve in democratic societies. These, in turn, feed into management and regulatory efforts to deal with hazards, or to prevent them from occurring (or occurring all too frequently). One consequence of this is the development of &#147;risk consciousness&#148;, &#147;public risk discourses&#148;, and &#147;risk management policies&#148;. Such a situation calls forth public relations specialists, educational campaigns for the press and public, manipulation of the mass media, and the formation of advisory groups, ethics committees, and policy communities &#151; which have become as important as research and its applications. They provide to a greater or lesser extent some sense of certainty, normative order, and risk minimization. </P>     <p>&nbsp;</P>     <p> <I>Bounded knowledge and the limits of the control of complex risky technologies and socio-technical systems</I> </P>     <p> Complex systems &#151; our knowledge of socio-technical systems, including the complex systems that humans construct, is bounded.<A HREF="#11"><SUP>11</SUP></A> <a name="top11"></a>Consequently, the ability to control such systems is imperfect.    
First, there is the relatively simple principle that radically new and complex technologies create new ways of manipulating the physical, biological, and social worlds and thus often produce results that cannot be fully anticipated and understood effectively in advance. This is because they are quite literally beyond the experiential base of existing models that supposedly contain knowledge about such systems. This problem can be met by the progressive accumulation of scientific, engineering, managerial, and other practical knowledge. The body of knowledge may indeed grow, even if this occurs in part as a consequence of accidents and catastrophes. Even then, there will always be limits to this knowledge development (Burns and Dietz, 1992b; Burns and others, 2001). </P>     ]]></body>
<body><![CDATA[<p> The larger scale and tighter integration of modern complex systems make these systems difficult to understand and control (Perrow, 1999; Burns and Dietz, 1992b). Failures can propagate from one subsystem to another, and overall system performance deteriorates to that of the weakest subsystem. Subsystems can be added to prevent such propagation, but these new subsystems add complexity and may be the source of new, unanticipated, and problematic behavior of the overall system. Generally speaking, these are failures of design, and could at least in principle be solved through better engineering, including better &#147;human engineering&#148;. In practice, the large scale and complex linkages between system components and between the system and other domains of society make it very difficult to adequately understand these complex arrangements. The result is not only &#147;knowledge problems&#148; but &#147;control problems&#148;, because available knowledge cannot generate adequate scenarios and predictions of how the system will behave under various environmental changes and control interventions. </P>     <P> The greater the complexity of a system, the less likely it is to behave as the sum of its parts. But the strongest knowledge used in many cases of systems design, construction, and management is often derived from the natural sciences and engineering, which in turn are based on experimental work with relatively simple and isolated systems. There is a lack of broader or more integrative representation. The more complex the system, and the more complex the interactions among components, the less salient the knowledge about the particular components becomes for understanding the whole. In principle, experimentation with the whole system, or with sets of subsystems, could be used to elucidate complex behavior. 
In practice, however, such experiments are difficult and complex to carry out, and too expensive and risky, because the number of experimental conditions required increases at least as a product of the number of components. Actual experience with the performance of the system provides a quasi-experiment, but as with all quasi-experiments, the lack of adequate controls and isolation, coupled with the complexity of the system, makes the results difficult to interpret. Typically, competing explanations cannot be dismissed. In any case, agreement on system description and interpretation lags as the system evolves from the state it started from at the beginning of the quasi-experiment. This is one limit to the improvements that can be made in the models &#151; that is, the knowledge &#151; of these complex systems. </P>     <p> When a system&#146;s behavior begins to deviate from the routine, operators and managers must categorize or interpret the deviation in order to know what actions to take. This process involves higher-order rules, including rules about what particular rules to use (&#147;chunking rules&#148;). Because the exceptions to normal circumstances are by definition unusual, it is difficult to develop much accumulated trial-and-error knowledge of them. As a result, higher-order rules are often more uncertain than basic operating rules, and are more likely to be inaccurate guides to how the system will actually behave under irregular conditions. This is another way in which complexity hinders our ability to develop an adequate understanding and control of the system. </P>     <p> Technical division of labor &#151; designers, builders, and operators of the system are often different people working in very different contexts and according to different rules with different constraints. Each may be more or less misinformed about the rule systems used by the others. 
Designers may define a rigid set of rules for operators, thus allowing designers to work with greater certainty about system performance. But since the system model is imperfect, these rigid rules are likely to prevent operators from adjusting to the real behavior of the system. When operators do make such adjustments &#151; adjustments that are often useful in the local context &#151; they are deviating, of course, from the formal rule system, and, from the viewpoint of the systems designer, can be considered &#147;malfunctioning&#148; components. A further factor is the length of human life and of career patterns, which means that the system&#146;s original designers are often no longer around when operators have to cope with emergent problems, failures, and catastrophes. System documentation is as subject to limitations as model building and thus assures that operators will always be faced with &#147;unknown&#148; system characteristics. </P>     <P> Problems of authority and management &#151; a hierarchy of authority creates different socio-cultural contexts for understanding the system and differing incentives to guide action. As one moves up in the hierarchy, pressures to be responsive to broader demands, especially demands that are external to the socio-technical system, become more important. The working engineer is focused on designing a functional, safe, efficient system or system component. Her supervisor, in the case of a business enterprise, must also be concerned not only with the work group&#146;s productivity but with the highest corporate officials&#146; preoccupation with enterprise profitability &#151; and the owners of capital are concerned with the overall profitability of their portfolio.    
Because most modern complex systems are tightly linked to the economy and polity, these external pressures at higher levels can overwhelm the design logic of those who are working &#147;hands-on&#148; in systems design, construction, and operation. In some cases, this may be the result of callous intervention to meet profit or bureaucratic incentives. In other cases it may be the result of innocent &#147;drift&#148;. But in either situation, the result is much the same &#151; the operating rules or rules-in-practice are at odds with the rules that were initially designed to optimize systems design, construction, and operation. </P>     <p> In addition to these macro-level interactions between the complex system and the other rule-governed domains of society, there are meso- and micro-level processes at work. Managerial and other cohorts must get along with one another and accommodate each other as individuals or groups. The day-to-day interaction inside and often outside the workplace makes internal mechanisms of auditing and criticism difficult to sustain. The &#147;human factor&#148; thus enters in the form of deviance from safe practices and miscalculations, mistakes, and failures of complex systems.<A HREF="#12"><SUP>12</SUP></A> <a name="top12"></a></P>     <p> A less recognized problem is that the processes of selection acting on rules and the processes of rule transmission will not necessarily favor rules that are accurate models of the interaction between technology and the physical, biological, and social worlds. Perhaps in the very long run the evolutionary epistemology of Karl Popper and Donald Campbell will produce an improved match between the rule system of a culture and &#147;truth&#148;, but there is no guarantee that this will occur in the short run in any given culture. 
Even relatively simple models of cultural evolution demonstrate that disadvantageous traits can persist and even increase in frequency. The existing structure of a culture may make difficult the spread of some rules that, whatever their verisimilitude, are incongruous with other existing rules. Nor is this necessarily an unconscious process. Individuals with power may favor and sustain some rules over others, whatever their actual utility or veracity in relation to the concrete world. </P>     <P> The bounded rationality of models &#151; we must recognize that the idea of bounded rationality applies to models as much as to people or organizations, since models are developed and transmitted by people and organizations. Human individuals and organizations use information-processing patterns that involve heuristics and biases, simplifications, rules of thumb, and satisficing in searches for answers. In addition, since many contemporary systems, including technologies, are too complex for any single individual to understand fully, problems in model development result from the process of aggregating individual understandings into a collectively shared model. Aggregation of individual understandings and attendant models provides cross-checks and a larger pool of understanding on which to draw, and, in that way, the collective model will be preferable to individual models, which, even if not seriously flawed in other ways, will inevitably be incomplete. But problems of group dynamics and communication interfere with accurate modeling by a group. Groups always have agendas and dynamics that are to a large degree independent of the formal tasks to which they are assigned. These perspectives and agendas mean that there are more goals &#147;around the table&#148; than simply developing the best possible or most accurate operative model. 
Alternative goals can lead to decisions about model construction that result in a specific model less accurate than would otherwise be possible. Delphi and other group-process methods were developed specifically because of these group-process problems in technological decision making. </P>     <p> In sum, problems of individual and collective understanding and decision-making lead to flawed models (Burns and Dietz, 1992b). Formal models may often be used to get past these problems, but they cannot eliminate them entirely. Here we note that models are limited even when all the biases of individual and group decision making are purged from them. A model of a complex system is typically built by linking models of simple and relatively well-understood component systems. Thus, each element of the formal model is in itself a model of reality that eventually must be a translation from an individual or group understanding to a formal, explicit, possibly mathematical, understanding of that reality. For simple processes, both the understanding and the translation into a mathematical model may be reasonably accurate and complete. But not all subsystems of a complex system are well understood. This leads to a tendency to model those processes that are well understood, usually the linear and physical features of the system, and to ignore or greatly simplify elements that are not well understood. In such models, &#147;bad numbers drive out good paragraphs&#148;. As a result, human operators are modeled as automatons and the natural environment as a passive sink for effluent heat, materials, etc. In addition, the long history of trial-and-error experimentation with the isolated components of the system, particularly physical components, has led to laws describing them in ways that are reasonably precise and accurate. 
This halo of precision and accuracy is often transferred to other elements of the system even though they are less well researched and cannot be subject to experimental isolation. And while some of the subsystems may be relatively well understood in themselves, it is rare that the links between the systems are understood. This is because such links and the resulting complexities are eliminated intentionally in the kinds of research and modeling that characterize most physical science and engineering. Again, the halo effect applies, and a technological hubris of overconfidence and limited inquiry may result. Finally, we should note that the model used to design and control the behavior of the system is in itself a part of the system. Since it cannot be isomorphic with the system, the behavior of the model must be taken into account when modeling the system, leading to an infinite regress. </P>     <P> The functioning and consequences of many innovations cannot be fully specified or predicted in advance. Of course, tests and trials are usually conducted. In the case of complex systems, however, these cover only a highly selective, biased sample of situations. Performance failings in diverse, in some cases largely unknown, environments will be discovered only in the context of operating in those particular environments.<A HREF="#13"><SUP>13</SUP></A> <a name="top13"></a>Not only is it impossible to effectively identify and test all impacts (and especially long-term impacts) of many new technologies, whose functioning and consequences are difficult to specify; there are also minimal opportunities to test complex interactions. Among other things, this concerns the impact of new technologies on human populations, where typically there is great variation in people&#146;s sensitivity, vulnerability, absorption, etc. </P>     ]]></body>
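<body><![CDATA[<p> The earlier claim that the number of experimental conditions required for whole-system experimentation increases at least as a product of the number of components can be made concrete with a small sketch. This is our own illustration; the component and setting counts are hypothetical. </p>

```python
# Minimal sketch (hypothetical numbers): in a full factorial experiment, the
# number of test conditions is the product of the number of settings (levels)
# per component, so covering every combination quickly becomes impractical
# for systems with many interacting components.
from math import prod

def full_factorial_conditions(levels_per_component):
    """Distinct test conditions needed to cover every combination of settings."""
    return prod(levels_per_component)

assert full_factorial_conditions([3, 3, 3, 3]) == 81   # 4 components, 3 settings each
assert full_factorial_conditions([3] * 10) == 59_049   # 10 components: 3**10 conditions
```

<p> This multiplicative growth is one reason why, in practice, operators must fall back on the quasi-experiment of actual operating experience, with all the interpretive limits noted above. </p> ]]></body>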
<body><![CDATA[<p> Of course, the critical criterion for model adequacy is whether or not the model is helpful in designing and controlling the system. A model, though inevitably incomplete and inaccurate, may be sufficiently complete and accurate to be of great practical value. But we also note that there are strong tendencies for such models to be more inaccurate and incomplete in describing some aspects of the system than others &#151; particularly in describing complex interactions of components of the system, the behavior of the humans who construct, manage, and operate the system, and the interactions of the system with the natural and social environments. The failure to understand the internal physical linking of the system usually calls for more sophisticated research and modeling. The failure to understand human designers, builders, and operators is labeled &#147;human error&#148; on their part, rather than an error in the systems model. These failings call for a more sophisticated social science modeling of the &#147;human factor&#148; in relation to complex technologies and socio-technical systems, as the next section sets out to accomplish. </P>     <p>&nbsp;</P>     <p> <I>The complexity of governance systems and regulative limitations</I> </P>     <p> We have suggested here the need for more integrative approaches. This is easier said than done. Modern life is characterized by specialization and the fragmentation of knowledge and institutional domains. There is a clear and present need for overarching deliberation and strategies on the multiple spin-offs and spill-overs of many contemporary technology developments, and on the identification and assessment of problems of incoherence and contradiction in these developments. </P>     <p> That is, problems of integration are typical of many technological issues facing us today. 
National governments are usually organized into ministries  or departments, each responsible for a particular policy area, whether  certain aspects of agriculture, environment, foreign affairs, trade and  commerce, finance, etc. Each ministry has its own history, interests and  viewpoints, and its own &#147;culture&#148; or ways of thinking and doing things.  Each is open (or vulnerable) to different pressures or outside interest  groups. Each is accountable to a greater or lesser extent to the others,  or the government, or the public in different ways. </P>     <p> Policy formulation, for example in the area of bio-diversity, cuts across    several branches of a government, involves forums outside of the government    or even outside inter-governmental bodies. And in any given forum, a single    ministry may have its government&#146;s mandate to represent and act in its    name. This might be the ministry of foreign affairs, asserting its authority    in forums which may also be the domains of other ministries (e.g., Agriculture-FAO;    Environment-UNEP).<A HREF="#14"><SUP>14</SUP></A><a name="top14"></a> A variety    of NGOs are engaged. Consider agricultural-related bio-diversity. It is perceived    in different ways by the various actors involved: for instance, (i) as part    of the larger ecosystem; (ii) as crops (and potential income) in the farmer&#146;s    field; (iii) as raw material for the production of new crop varieties; (iv)    as food and other products for human beings; (v) as serving cultural and spiritual    purposes; (vi) as a commodity to sell just as one might sell copper ore or handicrafts;    (vii) or as a resource for national development. In short, many different interest    groups are in fact interested in it. </P>     <p> Consequently, there is substantial complexity and fragmentation of policymaking  concerning bio-diversity. 
As Fowler (1998: 5) stresses: &#147;Depending on how the &#145;issue&#146; is defined, the subject of agro-biodiversity can be debated in any of a number of international fora, or in multiple fora simultaneously. It can be the subject of debate and negotiation in several of the UN&#146;s specialized agencies, inter alia, the Food and Agriculture Organization (FAO), the UN Development Programme (UNDP), the UN Environment Programme (UNEP), the UN Conference on Trade and Development (UNCTAD), the World Health Organization (WHO), the International Labour Organization (ILO), the UN Economic, Social and Cultural Organization (UNESCO), the World Trade Organization (WTO), the UN&#146;s Commission on Sustainable Development, or through the mechanism of a treaty such as the Convention of Biological Diversity.&#148; Each might assert a logical claim to consider some aspect of the topic; thus, government agencies might pursue their interests in any of these fora, choosing the one, or the combination, which offers the greatest advantage. Some ministries within some governments may consider it useful to try to define the issues as trade issues, others as environmental issues, and still others as agricultural or development issues. But in each case a different forum with different participants and procedures would be indicated as the ideal location for struggle, according to the particular framing (Fowler, 1998: 5). </P>     <P> Fowler (1998: 5) goes on to point out: &#147;The multiplicity of interests and fora, and the existence of several debates or negotiations taking place simultaneously, can tax the resources of even the largest governments and typically lead to poorly coordinated, inconsistent and even contradictory policies. To some extent, contradictory policies may simply demonstrate the fact that different interests and views exist within a government. 
Contradictions and inconsistencies may, amazingly, be quite logical and purposeful. But, in many cases, ragged and inconsistent policies can also be explained in simpler terms as poor planning, coordination and priority setting. More troubling is the fact that discordant views enunciated by governments in different negotiating fora can lead to lack of progress or stalemate in all fora.&#148; </P>     <p> This case, as well as many others, illustrates the complexity of policymaking and regulation in technical (and environmental) areas. Further examples can be found in numerous areas: energy, information technology, bio-technologies, finance and banking, etc. The extraordinary complexity and fragmentation of the regulatory environment make for highly risky systems, as controls work at cross-purposes and break down. There is an obvious need for more holistic perspectives and long-term integrated assessments of technological developments, hazards, and risks. </P>     <p>&nbsp;</P>     ]]></body>
<body><![CDATA[<p> <I>Toward a socio-technical systems theory of risky systems and accidents<A HREF="#15"><SUP>15</SUP></A></I>    <a name="top15"></a> </P>     <p> Our scheme of complex causality (see figure 1) enables us to identify ways in which configurations of controls and constraints operate in and upon hazardous systems to increase or decrease the likelihood of significant failures that cause, or risk causing, damage to human life and property as well as to the environment. For example, in a complex system such as an organ transplant system, there are multiple phases running from donation decisions and extraction to transplantation into a recipient (Machado, 1998). Different phases are subject to more or less differing laws, norms, and professional and technical constraints. To the extent that these controlling factors operate properly, the risks of unethical or illegal behavior, organ loss, and transplant failure are minimized &#151; accidents are avoided &#151; and people&#146;s lives are saved and their quality of life is typically improved in ways that are considered legitimate. Conversely, if legal, ethical, professional, or technical controls break down (because of lack of competence, lack of professional commitment, the pressures of contradictory goals or groups, organizational incoherence, social tensions and conflicts among the groups involved, etc.), then the risk of failure increases and either invaluable organs are lost for saving lives or the transplant operations themselves fail. Ultimately, the entire system may be de-legitimized, and public trust and levels of donation and support of transplantation programs decline. </P>     <p> Our ASD conceptualization of risky systems encompasses social structures, human agents (individual and collective), technical structures, and the natural environment (in other analytic contexts, the social environment is included) and their interplay (see figure 1). 
Our classification scheme presented below in table 1 uses the general categories of ASD: agency, social structure, technical structure, environment, and systemic factors (that is, the linkages among the major factor complexes). </P>     <p> Section 2 pointed out that the theory operates with configurations of causal factors (that is, the principle of multiple types of qualitatively different causalities or drivers applies):<A HREF="#16"><SUP>16</SUP></A><a name="top16"></a> in particular, the causal factors of social structure, technical structure, and human agency (social actors and their interactions) as well as environmental factors (physical and ecosystem structures) and the interplay among these complexes (see figure 1). These causal factors are potential sources of failings, or the risks of failure, in the operation or functioning of a hazardous technology system, that is, the likelihood of accidents with damage to life and property as well as to the environment (even for low-hazard systems, there are of course problems of failures). </P>     <p> Below we exhibit in table 1 how ASD theory enables the systematic identification of the values of variables which make for a low-risk or, alternatively, a high-risk operation of the same socio-technical system. </P>     <p> Table 1 also indicates how our models enable a bridging of the large gap between Perrow&#146;s framework and that of LaPorte on highly reliable risky systems (see also Rosa (2005) on the challenge of this bridging effort with respect to normal accidents in complex systems). On the one hand, Perrow has little to say about agential forces and also largely ignores the role of turbulent environments. On the other hand, LaPorte does not particularly consider some of the technical and social structural features which Perrow emphasizes, such as non-linear interactions and tight coupling (Perrow, 1999). 
The ASD approach considers agential, social and technical structural, and environmental drivers and their interplay. It allows us to analyze and predict which internal as well as external drivers or mechanisms may increase &#151; or alternatively decrease &#151; the risk of accidents, and how they do this, as suggested in table 1 below. For the sake of simplifying the presentation, we make only a rough, dichotomous distinction between low-risk and high-risk hazardous systems. In an elaborated multi-dimensional space, we would show how the scheme can be used to analyze and control each factor in terms of the degree of risk it entails for the functioning, and potential failure, of a hazardous socio-technical system, which can cause harm to life, property, and the environment. </P>     <p>&nbsp;</P>     <p><b><a href="/img/revistas/spp/n61/61a02t1.jpg" target="_blank">Table 1</a> </b>The multiple factors of risk and accident in the case of hazardous systems: a socio-technical system perspective</P>     
<p>&nbsp;</P>     <P> Performance failures in risky systems must be comprehended in social as well as physical systems terms, encompassing the multiple components, and the links among components, of subsystems, including individual and collective agents, social structures such as institutions and organizations, and material structures including the physical environment. Risky or hazardous behavior results from inadequate or missing controls or constraints in the system (Leveson, 2004). </P>     ]]></body>
<body><![CDATA[<p> A basic principle of ASD systems analysis is that effective control of major performance factors (processes as well as conditions) will reduce the likelihood of failures and accidents and make for low-risk system performance, even in the case of highly hazardous systems (that is, LaPorte&#146;s type of model). But the absence or failure of one or more of these controls and constraints will increase the risk of failure or accident in the operating system (notice that Perrow&#146;s cases are included here within a much broader range of cases, as suggested in table 1 above, column 3). As LaPorte and colleagues (LaPorte, 1984; LaPorte and Consolini, 1991) emphasize on the basis of their empirical studies of highly reliable systems, design, training, development of professionalism, redundant controls, multi-level regulation and a number of other organizational and social psychological factors can effectively reduce the risk levels of hazardous systems. </P>     <p> Accidents in hazardous systems occur whenever one or more components (or subsystems) of the complex socio-technical system fail, or essential linkages break down or operate perversely. Such internal as well as external &#147;disturbances&#148; fail to be controlled or buffered (or adapted to) adequately by the appropriate controllers. </P>     <P>The table provides a more or less static picture, but more dynamic considerations and analyses readily follow from the ASD model (see figure 2). </P>     <P>&nbsp;</P>     <P><img src="/img/revistas/spp/n61/61a02f2.jpg" width="259" height="245"></P>     
<P><b>Figure 2</b> Impact of societal changes on the &#8220;human&#8221; and other factors of socio-technical systems</P>     <P>&nbsp;</P>     <p> Thus, one can model and analyze the impact of social, cultural, and political factors on the institutional arrangements and the behavior of operatives, managers, and regulators of hazardous socio-technical systems. The following points are a selection of only a few of the possible social systemic analyses implied. </P>     <p> (I)&nbsp;&nbsp;&nbsp;&nbsp;Levels of knowledge and skills of managers, operatives, and regulators may decline with respect to, for instance, the socio-technical system and its environmental conditions and mechanisms. Innovations may be introduced in the socio-technical system with effects which go far beyond the established knowledge of operatives, management, and regulators. </P>     <p> (II)&nbsp;&nbsp;&nbsp;&nbsp;Levels of professionalism and commitment may decline because of problems (including costs) of recruitment, training, or further education of managers and operatives. Or the decline may take place because of changes in values and norms in society: for instance, an increased emphasis emerges on economic achievement, cost-cutting, and profitability, increasing the likelihood of risk-taking and accidents. That is, values and goals unrelated to minimizing the risk of accidents are prioritized over safety. </P>     ]]></body>
<body><![CDATA[<p> (III)&nbsp;&nbsp;&nbsp;&nbsp;There may be declining capability or interest in understanding, modeling, or managing the &#147;human factor&#148;. </P>     <p> (IV)&nbsp;&nbsp;&nbsp;&nbsp;Highly competitive systems, such as capitalism, drive innovation beyond the full knowledge and skills of operatives, managers, and regulators. </P>     <p> (V)&nbsp;&nbsp;&nbsp;&nbsp;Local adaptations or innovations are not communicated to others outside the local context, so that the stage is set for incompatibilities, misunderstandings, and accidents. The same type of contradictory development may occur between levels as well: for instance, changes at the top management level are not communicated to or recognized by subordinate groups. Or, vice versa, shifts occur in operative groups&#146; beliefs and practices that impact on risk-taking and safety without this being communicated to or perceived by managers and/or regulators. </P>     <p> (VI)&nbsp;&nbsp;&nbsp;&nbsp;As stressed earlier, many interactions (or potential interactions) in a system, or between the system and its environment, are not recognized or adequately modeled. Indeed, some could not have been modeled or anticipated. 
They are emergent (Burns and DeVille, 2003; Machado and Burns, 2001).<A HREF="#17"><SUP>17</SUP></A> <a name="top17"></a>Thus, new types of hazards, vulnerabilities, and human error emerge, which the established paradigm of professionalism, training, and regulation fails to fully understand or take into account: for instance, the complex relationships resulting from the increased use of automation combined with human involvement make for new types of hazards, vulnerability, and human error. </P>     <p> (VII)&nbsp;&nbsp;&nbsp;&nbsp;The speed of innovation &#151; and the diversity of innovations &#151; means that there is less time to test and learn about the various &#147;frankensteins&#148; and even less to find out about their potential interactions and risks (Marais, Dulac and Leveson, 2004). </P>     <p> In sum, the ASD system conceptualization encompasses social structures, human agents (individual and collective), and physical artifacts as well as the environment, especially the natural environment, and their interplay (in other analytic contexts, the social environment is included).  
Understanding and reducing the risks associated with hazardous systems entails (see Leveson, 2004; Leveson and others, 2009; and Marais, Dulac and Leveson, 2004): </P> <UL>       <LI> identifying the multiple factors, including the many &#147;human factors&#148;, that are vulnerable or prone to breakdown or failure, resulting in accidents that cause harm to life and property as well as to the environment; </LI>       <LI> establishing and maintaining appropriate controls and constraints &#151; among other things, recognizing and taking measures to deal with breakdowns in controls (through human and/or machine failings); </LI>       <LI> dealing with changes in and outside of the system which may increase hazards or introduce new ones, as well as increase the risks of accidents (this would include monitoring such developments and making proactive preparations) (see figure 2). </LI>     </UL>     ]]></body>
<body><![CDATA[<p>&nbsp; </P>     <p><B>Conclusions</B> </P>     <p> The ASD approach advocates the development of empirically grounded and relevant theorizing, drawing on the powerful theoretical traditions of social systems theory, institutionalism, and cognitive sociology, for instance in investigating the cognitive and normative bases of risk judgment and management. It provides a social science theory which enables, in a systematic way, the understanding, analysis, and control (that is, risk management) of complex hazardous socio-technical systems, in particular the role of &#147;human factors&#148;, in order to prevent or minimize accidents. </P>     <p> A social systems perspective re-orients us away from reductionist approaches toward more systemic perspectives on risk: social structure including institutional arrangements, cultural formations, complex socio-technical systems, etc. </P>     <p> Several key dimensions of risky systems have been specified in this article, for instance: (i) powerful systems may have high capacities to cause harm to the physical and social environments (concerning, for instance, social order, welfare, health, etc.); (ii) hierarchical systems where elites and their advisors adhere with great confidence to, and implement, abstract models. These models have, on the one hand, ideological or moral meanings but entail, on the other, radical impacts on the social and physical environment. Developments are decided by an elite certain of their values and the truth of their knowledge, ignoring or neglecting other value orientations and interests, for instance, those of the populations dominated by the elite; (iii) agents may be operating in a highly competitive context which drives them to initiate projects and carry through social transformations which generate major risks for the social and physical environments. 
In general, &#147;competitive  systems&#148; drive experimentation, innovation, and transformation (Burns and  Dietz, 2001). That is, institutional arrangements and related social processes  generate &#151; or at least tolerate the application of &#151; dangerous technologies  and socio-technical constructions. The forms of modern capitalism combining  hierarchical arrangements (the corporate structure of the firm) with institutional  incentive systems (&#147;the profit motive&#148;) entail such risky structural arrangements.  Economic agents are driven toward excessive exploitation of natural resources,  risky disposal of hazardous wastes, and risky technological developments  (exemplified by forms of advanced industrial agriculture which in Europe  resulted in the production and spread of &#147;mad-cow disease&#148;). Capitalism  and the modern state together with science and technology transform nature  (untamed, unknown) into an environment of resources which can be defined  in terms of prices, exchange-values and income flows. In very many areas,  this is a risky business (and indeed, the &#147;capitalist machine&#146;s&#148; ultimate  limitation is arguably the physical environment and available natural resources  as well as its capacity to absorb wastes) (Burns and others, 2002). Moreover,  capitalist wealth is often mobilized to attack and de-legitimize the criticisms  of, for instance, environmental movements and scientists that call for  limiting risky socio-economic developments as well as to buy off governments  that are under pressure to regulate; (iv) even if knowledge about the systems  is high (which may not be the case), it is bounded knowledge (Simon, 1979).  
There will be unanticipated and unintended consequences which cause harm or threaten to harm particular groups, possibly society as a whole, and the environment; (v) all social systems, including socio-technical systems, are subject to internal and external drivers stressing and restructuring the systems (not necessarily in appropriate or beneficial directions), making for an increase in existing risks or the emergence of entirely new risks; (vi) there is often a relatively low (but increasing) level of reflectivity about the bounded knowledge, unpredictability, and problems of unanticipated consequences. Feedback is low and deep social learning fails to take place, for instance, with respect to some of the negative impacts on the social and physical environments; (vii) such systems generate risks &#151; possibly beyond the capacity of institutions to learn fast enough about, and to reflect on, the changes, their value or normative implications, their likely or possible implications for sustainability, and the appropriate strategies for dealing with them. </P>     <p> Modern institutional arrangements and their practices combine different types of selective processes, which may encourage (or at least fail to constrain) risky behavior having extensive and powerful impacts on the physical and social environments: for instance, many industrial developments, transport systems, nuclear power facilities, electrical networks, airports, etc. For example, modern capitalism &#151; with its super-powers and mobilizing capabilities &#151; generates a wide spectrum of risks. There are not only risky technologies, which in an increasingly complex world cannot be readily assessed or controlled, but there are risky socio-technical systems of production, generating risky technologies, production processes, products, and practices (Machado, 1990; Perrow, 1994, 1999, 2004). 
In highly complex  and dynamic systems, some hazards cannot be readily identified, and probabilities  of particular outcomes cannot be determined and calculated. In other words,  some risks cannot be known and measured beforehand. This is in contrast  to cases of well-known, relatively closed technical systems, where one  can determine all or most potential outcomes and their probabilities under  differing conditions and, therefore, calculate and assess risks. Also,  some negative consequences arise because values of concern to groups and  communities are not taken into account &#151; as a result of the prevailing  power structure &#151; or are never anticipated because of the limitations of  the models utilized by power-wielders. </P>     <p> In conclusion, modern societies have developed and continue to develop  revolutionary economic and technological powers &#151; driven to a great extent  by dynamic capitalism &#151; at the same time that they have bounded knowledge  of these powers and their consequences. Unintended consequences abound:  social as well as ecological systems are disturbed, stressed, and transformed.  But social agents and movements form and react to these conditions, developing  new strategies and critical models and providing fresh challenges and opportunities  for institutional innovation and transformation. Consequently, modern capitalist  societies &#151; characterized by their core arrangements as well as the many  and diverse opponents to some or many aspects of capitalist development  &#151; are involved not only in a global struggle but a largely uncontrolled  experiment (or, more precisely, a multitude of experiments). The capacity  to monitor and to assess such experimentation remains severely limited,  even more so with the rapid changes in global capitalism. 
Because the capacity to constrain and regulate global capitalism is currently highly limited, the riskiness of the system is greatly increased, raising new policy and regulatory issues. How is the powerful class of global capitalists to be made responsible and accountable for their actions? What political or regulatory forms and procedures might link the new politics suggested above to the global capitalist economy and, thereby, increase the likelihood of effective governance and regulation? </P>     <p> In the continuing human construction of different technologies and socio-technical systems, there are many hazards and risks, some of them indeed very dangerous and even compromising of life on earth, such as nuclear weapons, fossil fuel systems bringing about radical climate change, etc. Humanity knows only a small part of the mess it has been creating, and continues to create. A great many of our complex constructions are not fully predictable or understandable &#151; and not completely controllable. An important part of the &#147;risk society&#148;, as we have interpreted it (Burns and Machado, 2009), consists of the higher levels of risk consciousness, risk discourse, risk demands, and risk management. These have contributed &#151; and hopefully will continue to contribute in the future &#151; to more effective control of risks, or to eliminating some types of risks altogether. This article has aimed to conceptualize and enhance the regulation of the multiple &#147;dimensions&#148; and mechanisms of the &#147;human factor&#148; in risky systems and cases of accidents. </P>     <p>&nbsp;</p>    <p> <B>References</B> </P>     ]]></body>
<body><![CDATA[<p> Baumgartner, T., and T. R. Burns (1984), <I>Transitions to Alternative Energy Systems. Entrepreneurs, Strategies, and Social Change</I>, Boulder, Colorado, Westview Press. </P>     <p> Baumgartner, T., T. R. Burns, and P. DeVille (1986), <I>The Shaping of Socio-Economic Systems</I>, London, England, Gordon and Breach. </P>     <p> Beck, U. (1992), <I>Risk Society. Towards a New Modernity</I>, London, Sage Publications. </P>     <p> Buckley, W. (1967), <I>Sociology and Modern Systems Theory</I>, Englewood Cliffs, NJ, Prentice-Hall. </P>     <p> Burns, T. R. (2006a), &#147;Dynamic systems theory&#148;, in Clifton D. Bryant and D. L. Peck (eds.), <I>The Handbook of 21st Century Sociology</I>, Thousand Oaks, California, Sage Publications. </P>     <p> Burns, T. R. (2006b), &#147;The sociology of complex systems: an overview of actor-systems-dynamics&#148;, <I>World Futures. The Journal of General Evolution</I>, 62, pp.&nbsp;411-460. </P>     <p> Burns, T. R., T. Baumgartner, and P. DeVille (1985), <I>Man, Decision and Society</I>, London, Gordon and Breach. </P>     <p> Burns, T. R., T. Baumgartner, and P. DeVille (2002), &#147;Actor-system dynamics theory and its application to the analysis of modern capitalism&#148;, <I>Canadian Journal of Sociology</I>, 27 (2), pp. 210-243. </P>     <p> Burns, T. R., T. Baumgartner, T. Dietz, and N. Machado (2002), &#147;The theory of actor-system dynamics: human agency, rule systems, and cultural evolution&#148;, in <I>Encyclopedia of Life Support Systems</I>, Paris, UNESCO. </P>     <p> Burns, T. R., and M. Carson (2002), &#147;Actors, paradigms, and institutional dynamics&#148;, in R. Hollingsworth, K. H. Muller, and E. J. Hollingsworth (eds.), <I>Advancing Socio-Economics. An Institutionalist Perspective</I>, Oxford, Rowman and Littlefield. </P>     ]]></body>
<body><![CDATA[<p> Burns, T. R., and P. DeVille (2003), &#147;The three faces of the coin: a socio-economic approach to the institution of money&#148;, <I>European Journal of Economic and Social Systems</I>, 16 (2), pp. 149-195. </P>     <p> Burns, T. R., and T. Dietz (1992a), &#147;Cultural evolution: social rule systems, selection, and human agency&#148;, <I>International Sociology</I>, 7, pp. 259-283. </P>     <p> Burns, T. R., and T. Dietz (1992b), &#147;Technology, sociotechnical systems, technological development: an evolutionary perspective&#148;, in M. Dierkes and U. Hoffman (eds.), <I>New Technology at the Outset. Social Forces in the Shaping of Technological Innovations</I>, Frankfurt/Main, Campus. </P>     <p> Burns, T. R., and T. Dietz (2001), &#147;Revolution: an evolutionary perspective&#148;, <I>International Sociology</I>, 16, pp. 531-555. </P>     <p> Burns, T. R., and H. Flam (1987), <I>The Shaping of Social Organization. Social Rule System Theory and its Applications</I>, London, Sage Publications. </P>     <p> Burns, T. R., and N. Machado (2009), &#147;Technology, complexity, and risk, part II: social systems perspective on socio-technical systems and their hazards&#148;, <I>Sociologia, Problemas e Pr&#225;ticas</I>, forthcoming. </P>     <P> Carson, M., T. R. Burns, and D. Calvo (eds.) (2009), <I>Public Policy Paradigms. Theory and Practice of Paradigm Shifts in the EU</I>, Frankfurt/Berlin/Oxford, Peter Lang, in process. </P>     <p> Editorial (2000), &#147;Emerging infections: another warning&#148;, <I>New England Journal of Medicine</I>, 342 (17), p. 1280. </P>     <p> Fowler, C. 
(1998), &#147;Background and current and outstanding issues of access  to genetic resources&#148;, <I>ESDAR Synthesis Report</I>. </P>     <p> Hammer, C. (2001), &#147;Xenotransplantation: perspectives and limits&#148;,<I> Blood  Purification</I>, 19, pp. 322-328. </P>     ]]></body>
<body><![CDATA[<p> Kerr, A., and S. Cunningham-Burley (2000), &#147;On ambivalence and risk: reflexive modernity and the new human genetics&#148;, <I>Sociology</I>, 34, pp. 283-304. </P>     <p> LaPorte, T. R. (1978), &#147;Nuclear wastes: increasing scale and sociopolitical impacts&#148;, <I>Science</I>, 191, pp. 22-29. </P>     <p> LaPorte, T. R. (1984), &#147;Technology as social organization&#148;, Working Paper, 84 (1), (IGS Studies in Public Organization), Berkeley, California, Institute of Government Studies. </P>     <p> LaPorte, T. R., and P. M. Consolini (1991), &#147;Working in practice but not in theory: theoretical challenges of &#145;high reliability organizations&#146;&#148;, <I>Journal of Public Administration Research and Theory</I>, 1, pp. 19-47. </P>     <p> Leveson, N. (2004), &#147;A new accident model for engineering safer systems&#148;, <I>Safety Science</I>, 42 (4), pp. 237-270. </P>     <p> Leveson, N., N. Dulac, K. Marais, and J. Carroll (2009), &#147;Moving beyond normal accidents and high reliability organizations: a systems approach to safety in complex systems&#148;, <I>Organization Studies</I>, 30 (2-3), pp. 227-249. </P>     <p> Machado, N. (1990), &#147;Risk and risk assessments in organizations: the case of an organ transplantation system&#148;, presented at the XII World Congress of Sociology, July 1990, Madrid, Spain. </P>     <p> Machado, N. (1998), <I>Using the Bodies of the Dead. Legal, Ethical and Organizational Dimensions of Organ Transplantation</I>, Aldershot, England, Ashgate Publishers. </P>     <p> Machado, N. (2005), &#147;Discretionary death: cognitive and normative problems resulting from advances in life-support technologies&#148;, <I>Death Studies</I>, 29 (9), pp. 791-809. </P>     <p> Machado, N. (2007), &#147;Race and medicine&#148;, in T. R. Burns, N. Machado, Z. Hellgren, and G. Brodin (eds.), <I>Makt, kultur och kontroll &#246;ver invandrares livsvillkor</I>, Uppsala, Uppsala University Press. 
</P>     ]]></body>
<body><![CDATA[<p> Machado, N. (2009), &#147;Discretionary death&#148;, in C. Bryant and D. L. Peck (eds.), <I>Encyclopedia of Death and the Human Experience</I>, London/Beverley Hills, Sage Publications. </P>     <p> Machado, N., and T. R. Burns (2001), &#147;The new genetics: a social science and humanities research agenda&#148;, <I>Canadian Journal of Sociology</I>, 25 (4), pp. 495-506. </P>     <p> Marais, K., N. Dulac, and N. Leveson (2004), &#147;Beyond normal accidents and high reliability organizations: the need for an alternative approach to safety in complex systems&#148;, <I>MIT Report</I>, Cambridge, Mass. </P>     <p> Perrow, C. (2004), &#147;A personal note on normal accidents&#148;, <I>Organization &amp; Environment</I>, 17 (1), pp. 9-14. </P>     <p> Perrow, C. (1999), <I>Normal Accidents. Living with High-Risk Technologies</I>, 2nd ed., Princeton, NJ, Princeton University Press, and New York, Basic Books (originally published in 1984 by Basic Books, New York). </P>     <P> Perrow, C. (1994), &#147;The limits of safety: the enhancement of a theory of accidents&#148;, <I>Journal of Contingencies and Crisis Management</I>, 2 (4), pp. 212-220. </P>     <p> Rosa, E. A. (2005), &#147;Celebrating a citation classic: and more&#148;, <I>Organization &amp; Environment</I>, 18 (2), pp. 229-234. </P>     <p> Rosenberg, N. (1982), <I>Inside the Black Box. Technology and Economics</I>, Cambridge, Cambridge University Press. </P>     <p> Simon, H. A. (1979), <I>Models of Thought</I>, New Haven, Yale University Press. </P>     <p> Thompson, L. (2000), &#147;Human gene therapy: harsh lessons, high hopes&#148;, <I>FDA Consumer Magazine</I>, September-October, pp. 19-24. </P>     ]]></body>
<body><![CDATA[<p> Vaughan, D. (1999), &#147;The dark side of organizations: mistake, misconduct, and disaster&#148;, <I>Annual Review of Sociology</I>, 25, pp. 271-305. </P>     <p>&nbsp;</P>     <p><a href="#top1">1</a> <a name="1"></a>This article&#8212;to appear in two parts&#8212;draws on an earlier paper by the authors presented at the Workshop on &#8220;Risk Management&#8221;, jointly sponsored by the European Science Foundation (Standing Committee for the Humanities) and the Italian Institute for Philosophical Studies, Naples, Italy, October 5-7, 2000. It was also presented at the European University Institute, Florence, Spring, 2003. We are grateful to Joe Berger, Mary Douglas, Mark Jacobs, Giandomenico Majone, Rui Pena Pires, and Claudio Radaelli and participants in the meetings in Naples and Florence for their comments and suggestions.</P>     <p><a href="#top2">2</a><a name="2"></a> Action is also constrained and facilitated by the responses of others who have the power to positively or negatively sanction, to persuade or inform. In other words, the agency of some actors affects the ability of other actors to exercise their own agency. In the extreme, powerful actors can severely restrict the agency of others in selected domains of social life.</P>     <p><a href="#top3">3</a> <a name="3"></a>Agency can also function on structural levels, operating upon institutional frameworks, socio-technical systems, and societal arrangements, that is, the exercise of meta-power (IV-A and IV-B). The exercise of meta-power involves, among other things, establishing incentive structures as well as opportunity and constraint structures for agents who have or potentially have dealings with one another (Burns, Baumgartner and DeVille, 1985). 
Meta-power actors structure, allocate, and select in ways that maintain (or reproduce) and change social structures, but they also impact to a greater or lesser extent on the physical environment and ecosystems.</P>     <p><a href="#top4">4</a><a name="4"></a> Particular types of markets, super-powerful transnational corporations, regulative regimes, and many new technologies being introduced are complex, dynamic systems entailing a variety of different risks.</P>     <p><a href="#top5">5</a><a name="5"></a> World Wide Web developments provide other examples. The internet was initially developed by academics. Later, the idea of its usefulness for multiple purposes emerged and spread. It was expected that it would function as a pure source of unlimited information and knowledge development; that small companies and cooperatives could gain from safe access to global networks; and that free and ideal transcultural exchange and learning could take place. But the emerging reality was somewhat different, a mixed picture. Among the unanticipated developments: the internet as a zone of risk (e.g., from hackers); the spread of misleading information (for instance, someone may report an accident or political crisis, and others naively spread this initial report, making for a process with a non-rational life of its own); criminal activity such as child pornography, deceitful schemes, etc.; and violent political groups, neo-nazis, terrorists, etc.</P>     <p><a href="#top6">6</a><a name="6"></a> Scandals and public issues have already occurred. In September 1999, there was a scandal over the death of an 18-year-old patient, Jesse Gelsinger, from a reaction to gene therapy at the University of Pennsylvania Institute of Human Gene Therapy. Following this case, further inspections conducted by the FDA of gene therapy trials resulted in the closure and suspension of several of those clinical trials. The reason: in many of those trials the researchers were not reporting the serious adverse events suffered by research subjects. Less than 5% (37 of 970) of the serious adverse events in these gene therapies were reported (see Thompson, 2000). Thus, in addition to demanding additional reporting from research groups and promoting Gene Transfer Safety symposia, and in order to restore public confidence, the FDA has proposed that all researchers conducting human trials of gene therapy and xenotransplantation post &#8220;safety&#8221; information about the clinical trials, such as side effects, adverse reactions, etc., on the FDA web page (<a href="http://www.fda.gov" target="_blank">www.fda.gov</a>).</P>     <p> <a href="#top7">7</a> <a name="7"></a>Such tests are useful to employers, insurance companies, and health authorities and welfare services, since they would allow them to limit their liabilities, target future services for specific &#8216;at risk&#8217; groups, and emphasize personal responsibility for disease alleviation and prevention (Kerr and Cunningham-Burley, 2000: 289).</P>     <p> <a href="#top8">8</a> <a name="8"></a>A new biological language is being developed. It is more politically correct than older languages such as that of &#8220;racism&#8221;. Also, &#8220;cognitive abilities&#8221; replaces the older and politically fraught concept of &#8220;intelligence&#8221;. New, broader continuums of disease have been established (such as the &#8220;schizophrenic spectrum&#8221;) (Kerr and Cunningham-Burley, 2000: 296).</P>     ]]></body>
<body><![CDATA[<p><a href="#top9">9</a><a name="9"></a> &#8220;Techno-Utopianism&#8221;; &#8220;Who plays God in the 21st century?&#8221; See Turning Point Project. <a href="http://www.turnpoint.org/" target="_blank">http://www.turnpoint.org/</a></P>     <p> <a href="#top10">10</a><a name="10"></a> Increased outbreaks of infectious diseases are associated with animal herds (pigs, cattle, chickens). An important factor in these outbreaks is the increasing industrialization of animal-food production in many areas of the world, which has propelled the creation of large-scale animal farms keeping substantial numbers of pigs or chickens, for example, in concentrated spaces. These conditions are commonly associated with a number of infectious outbreaks and diseases in the animal population, many of them a threat to human populations. Not surprisingly, this also explains the widespread use of antibiotics to avoid infections and to stimulate growth in these animal populations (increasing the risk of antibiotic-resistant infections in humans) (Editorial, 2000). Today, an increased proportion of the fruits and vegetables consumed in highly developed countries is grown and processed in less technologically developed countries. The procedures used to process food (e.g., pasteurization, cooking, canning) normally ensure safe products. However, these processing procedures can fail and sometimes do. With a global food supply, we encounter the risk that one defective product may contaminate a number of individuals spread across different countries (see Editorial, 2000). The existing nationally or regionally based health care infrastructures are not prepared to handle these problems. Earlier, people were infected by food and drink locally produced and locally consumed. We see here, in connection with technological developments, the differences between exogenous and endogenous dangers and risks.</P>     <p><a href="#top11">11</a><a name="11"></a> This section draws on Burns and others (2002); also, see Burns and Dietz (1992b).</P>     <p><a href="#top12">12</a> <a name="12"></a>In non-industrial and especially small-scale societies, most &#8220;system&#8221; development, including technological development, entails a substantial amount of trial-and-error innovation. Indeed, there is probably a direct correlation between the scale of a society and the degree to which system innovation and development depends on experimentation rather than on theory. The result is that the models and much of the knowledge that guide the development and use of human constructions, including technology, tend to be rather ad hoc and empirically based, with limited invocation of theoretical generalizations. In the modern world, and probably in most large-scale societies, the systems constructed, including technological systems, are often designed not on the basis of specific models developed inductively by experimentation with prototypes, but rather through application of the rules that constitute scientific, engineering, and managerial laws or other knowledge systems, which contain their own meta-rules about forms of evidence, generalization, inference, and so on. While this set of generalizations has allowed a vast expansion of system development, it also results in problems associated with the limits of such models and of de-contextualized knowledge in general.</P>     <p><a href="#top13">13</a> <a name="13"></a>Such environments may be generated in part through the very application of the technology.</P>     <p><a href="#top14">14</a><a name="14"></a> Yet the foreign ministry typically lacks the technical expertise of the specialist ministries, and this is one of the grounds driving competition among ministries from the same country.</P>     <p><a href="#top15">15</a> <a name="15"></a>Like Perrow (1999) and Rosa (2005), our approach emphasizes social systems, organizations, and institutions as key objects of analysis in determining risk and catastrophe. The field of risk analysis has been, and continues to be, dominated by psychological reductionism (Rosa, 2005: 230).</P>     <p><a href="#top16">16</a> <a name="16"></a>We cannot elaborate here on this concept of causality, except to point out that a major &#8220;causal factor&#8221; identified by Leveson (2004) is that of design, which has a proper place in the pantheon of causal analysis, although it is not usually referred to, at least in the social sciences. But in our systems perspective, it is a major factor.</P>     <p><a href="#top17">17</a><a name="17"></a> This fundamental aspect has been investigated by Burns and DeVille (2003) with respect to money and financial systems and by Machado in high-tech medicine (in particular, the new genetics in medicine; Machado and Burns, 2001), and it explains why regulatory instruments and regulatory institutions ultimately fail with respect to the dynamics and innovativeness of human agents and their many splendid creations.</P>     <p>&nbsp;</P>     ]]></body>
<body><![CDATA[<p><a href="#topa1">*</a> <a name="a1"></a>Tom R. Burns. Lisbon University Institute,    Lisbon, Portugal. Woods Institute for the Environment, Stanford University,    Stanford, Calif. Uppsala Theory Circle, Department of Sociology, University    of Uppsala, Box 821, 75108 Uppsala, Sweden. E-mail: <a href="mailto:tomburns@stanford.edu">tomburns@stanford.edu</a>.  </P>     <p> <a href="#topa2">**</a> <a name="a2"></a>Nora Machado. CIES, Lisbon University    Institute, Lisbon, Portugal. Science, Technology, and Society Program, Stanford    University, Stanford, Calif. Department of Sociology, University of Gothenburg,    Gothenburg, Sweden (on leave-of-absence). E-mail: <a href="mailto:noramachado@gmail.com">noramachado@gmail.com</a></P>      ]]></body><back>
<ref-list>
<ref id="B1">
<nlm-citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Burns]]></surname>
<given-names><![CDATA[T. R.]]></given-names>
</name>
<name>
<surname><![CDATA[Machado]]></surname>
<given-names><![CDATA[N.]]></given-names>
</name>
</person-group>
<article-title xml:lang="en"><![CDATA[Technology, complexity, and risk: part ii: social systems perspective on socio-technical systems and their hazards]]></article-title>
<source><![CDATA[Sociologia, Problemas e Práticas]]></source>
<year>2009</year>
</nlm-citation>
</ref>
</ref-list>
</back>
</article>
