
Causal map

Part of a causal map showing how Factor B causally influences Factor C

A causal map can be defined as a network consisting of links or arcs between nodes or factors, such that a link between C and E means, in some sense, that someone believes or claims C has or had some causal influence on E.
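
To make the structure concrete, the following minimal Python sketch (not drawn from the works cited here; the factor names and the add_link helper are purely illustrative) represents a causal map as a directed graph whose links record who claims what causal influence:

    # Minimal sketch of a causal map as a directed graph of claimed causal links.
    # The factor names and the add_link helper are purely illustrative.
    from collections import defaultdict

    causal_map = defaultdict(list)

    def add_link(cause, effect, claim="believed to have some causal influence"):
        # Record someone's claim that `cause` influences `effect`.
        causal_map[cause].append((effect, claim))

    add_link("Factor B", "Factor C")
    add_link("Rainfall", "Crop yield", claim="respondent claims a positive influence")

    for cause, links in causal_map.items():
        for effect, claim in links:
            print(f"{cause} -> {effect}  ({claim})")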

This definition could cover diagrams representing causal connections between variables which are measured in a strictly quantitative way, and would therefore also include closely related statistical models like Structural Equation Models[1] and Directed Acyclic Graphs (DAGs).[2] However, the phrase “causal map” is usually reserved for qualitative or merely semi-quantitative maps. In this sense, causal maps can be seen as a type of concept map. Systems diagrams and Fuzzy Cognitive Maps[3] also fall under this definition. Causal maps have been used since the 1970s by researchers and practitioners in a range of disciplines, from management science[4] to ecology,[5] employing a variety of methods. They are used for many purposes, for example:

  • As sketch diagrams to summarise causal links[6]
  • As tools to understand how decisions are made[7]
  • As tools to assist strategic planning[8]
  • As tools to form and represent a consensus of expert views on “what causes what” in a subject area[9]
  • As tools to investigate the differences in how different subjects view causal links in a subject area[10]
  • As a way to encode the separate views of many different respondents on “what causes what” in a subject area[11]
  • To represent “theories of change”[12] and “program theory”[13] in project management and evaluation

Different kinds of causal maps can be distinguished particularly by the kind of information which can be encoded by the links and nodes. One important distinction is to what extent the links are intended to encode causation or (somebody’s) belief about causation.

Causal mapping

Causal mapping is the process of constructing, summarising and drawing inferences from a causal map, and more broadly can refer to sets of techniques for doing this. While one group of such methods is actually called “causal mapping”, there are many similar methods which go by a wide variety of names.

The phrase “causal mapping” goes back at least to Robert Axelrod,[7] based in turn on Kelly’s personal construct theory.[14] The idea of wanting to understand the behaviour of actors in terms of the internal ‘maps’ of the world which they carry around with them goes back further, to Kurt Lewin[15] and the field theorists.[16] Causal mapping in this sense is loosely based on “concept mapping” and “cognitive mapping”, and sometimes the three terms are used interchangeably, though the latter two are usually understood to be broader, including maps in which the links between factors are not necessarily causal and which are therefore not causal maps.

Literature on the theory and practice of causal mapping includes a few canonical works[7] as well as book-length interdisciplinary overviews,[17][18] and guides to particular approaches.[19]

Cause–effect graph

In software testing, a cause–effect graph is a directed graph that maps a set of causes to a set of effects. The causes may be thought of as the input to the program, and the effects may be thought of as the output. Usually the graph shows the nodes representing the causes on the left side and the nodes representing the effects on the right side. There may be intermediate nodes in between that combine inputs using logical operators such as AND and OR.
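
As a rough illustration only (not a feature of any particular testing tool; the node names and the evaluate function are hypothetical), such a graph can be modelled as boolean logic in which the causes are inputs, the intermediate nodes apply AND/OR, and the effects read off the results:

    # Sketch of a tiny cause-effect graph: causes c1, c2, c3 feed intermediate
    # AND/OR nodes, which determine effects E1 and E2 (all names hypothetical).
    def evaluate(c1: bool, c2: bool, c3: bool) -> dict:
        n1 = c1 and c2          # intermediate node: AND of causes 1 and 2
        n2 = c2 or c3           # intermediate node: OR of causes 2 and 3
        return {"E1": n1, "E2": n2}

    print(evaluate(c1=True, c2=True, c3=False))   # {'E1': True, 'E2': True}
    print(evaluate(c1=False, c2=False, c3=True))  # {'E1': False, 'E2': True}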

Constraints may be added to the causes and effects. These are represented as edges labeled with the constraint symbol, drawn with a dashed line. For causes, the valid constraint symbols are E (exclusive), O (one and only one), I (at least one), and R (requires). The exclusive constraint states that at most one of causes 1 and 2 can be true, i.e. they cannot both be true simultaneously. The inclusive (at least one) constraint states that at least one of causes 1, 2, or 3 must be true, i.e. they cannot all be false simultaneously. The one-and-only-one (OaOO, or simply O) constraint states that exactly one of causes 1, 2, or 3 must be true. The requires constraint states that if cause 1 is true, then cause 2 must also be true, i.e. it is impossible for 1 to be true and 2 to be false.

For effects, the only valid constraint symbol is M (mask). The mask constraint states that if effect 1 is true, then effect 2 is false. Note that, unlike the other constraints, the mask constraint relates to the effects rather than the causes.
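
A minimal sketch of these constraints as boolean predicates, assuming purely illustrative helper names that are not part of any standard testing library:

    # Sketch of the constraints as boolean predicates (illustrative helpers only).
    def exclusive(*causes):          # E: at most one cause may be true
        return sum(causes) <= 1

    def inclusive(*causes):          # I: at least one cause must be true
        return any(causes)

    def one_and_only_one(*causes):   # O: exactly one cause must be true
        return sum(causes) == 1

    def requires(c1, c2):            # R: if cause 1 is true, cause 2 must be true
        return (not c1) or c2

    def mask(e1, e2):                # M: if effect 1 is true, effect 2 must be false
        return (not e1) or (not e2)

    assert exclusive(True, False) and not exclusive(True, True)
    assert requires(True, True) and not requires(True, False)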

The graph's direction is as follows:

Causes --> intermediate nodes --> Effects

The graph can always be rearranged so there is only one node between any input and any output. See conjunctive normal form and disjunctive normal form.

A cause–effect graph is useful for generating a reduced decision table.
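
For example, a short sketch (assuming a hypothetical two-effect graph and an exclusive constraint between causes 2 and 3) enumerates the valid cause combinations and records the resulting effects as decision-table rows:

    # Sketch: enumerate cause combinations, drop those that violate an assumed
    # exclusive (E) constraint between causes 2 and 3, and record the effects of a
    # hypothetical two-effect graph (E1 = c1 AND c2, E2 = c2 OR c3) as table rows.
    from itertools import product

    decision_table = []
    for c1, c2, c3 in product([False, True], repeat=3):
        if c2 and c3:                 # assumed E constraint: causes 2 and 3 not both true
            continue                  # skip cause combinations the constraint forbids
        effects = {"E1": c1 and c2, "E2": c2 or c3}
        decision_table.append(((c1, c2, c3), effects))

    for causes, effects in decision_table:
        print(causes, "->", effects)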

See also

List of Causal Mapping Software

References

  1. ^ Clogg, Clifford C.; Bollen, Kenneth A.; Long, J. Scott (1993). "Testing Structural Equation Models". Social Forces. 73 (3): 1161. doi:10.2307/2580595. ISSN 0037-7732. JSTOR 2580595.
  2. ^ Pearl, J; Mackenzie, D (2018). "The Book of Why: The New Science of Cause and Effect". Journal of the American Statistical Association. 115 (529): 482–485. arXiv:2003.11635. doi:10.1080/01621459.2020.1721245. ISSN 0162-1459. S2CID 213366968.
  3. ^ Kosko, 1986
  4. ^ Bougon, Michel; Weick, Karl; Binkhorst, Din (1977). "Cognition in Organizations: An Analysis of the Utrecht Jazz Orchestra". Administrative Science Quarterly. 22 (4): 606. doi:10.2307/2392403. ISSN 0001-8392. JSTOR 2392403.
  5. ^ Moon, Katie; Guerrero, Angela M.; Adams, Vanessa. M.; Biggs, Duan; Blackman, Deborah A.; Craven, Luke; Dickinson, Helen; Ross, Helen (2019-03-07). "Mental models for conservation research and practice". Conservation Letters. 12 (3). doi:10.1111/conl.12642. ISSN 1755-263X.
  6. ^ Murray, Charles Alan. Investment and tithing in Thai villages: a behavioral study of rural modernization. OCLC 24819834.
  7. ^ a b c Axelrod, Robert (1976). Structure of Decision: The Cognitive Maps of Political Elites. Princeton University Press. ISBN 978-1-4008-7195-7. OCLC 949946348.
  8. ^ Reynolds, Martin; Holwell, Sue, eds. (2010). Systems Approaches to Managing Change: A Practical Guide. Bibcode:2010satm.book.....R. doi:10.1007/978-1-84882-809-4. ISBN 978-1-84882-808-7.
  9. ^ Barbrook-Johnson, Pete; Penn, Alexandra (2021). "Participatory systems mapping for complex energy policy evaluation". Evaluation. 27 (1): 57–79. doi:10.1177/1356389020976153. ISSN 1356-3890. S2CID 231624497.
  10. ^ Laukkanen, Mauri; Wang, Mingde (2016-03-03). Comparative Causal Mapping. doi:10.4324/9781315573038. ISBN 9781315573038.
  11. ^ Copestake, J; Remnant, F (2019). "Generating credible evidence of social impact using the Qualitative Impact Protocol (QuIP): the challenge of positionality in data coding and analysis".
  12. ^ Davies, Rick (2004). "Scale, Complexity and the Representation of Theories of Change". Evaluation. 10 (1): 101–121. doi:10.1177/1356389004043124. ISSN 1356-3890. S2CID 62169076.
  13. ^ Chen, Huey-Tsyh (1990). Theory-driven evaluations. Sage Publications. ISBN 0-8039-3532-3. OCLC 611218200.
  14. ^ Kelly, G (1955). "Beneath the mask. An introduction to theories of personality". Personality and Individual Differences. 2 (4): 356. doi:10.1016/0191-8869(81)90099-4. ISSN 0191-8869.
  15. ^ Lewin, K. (1982). Force field analysis
  16. ^ Tolman, E. C. (1948). Cognitive maps in rats and men. Psychological Review, 55(4), 189.
  17. ^ Huff, A.S. (1990). "Mapping strategic thought". Long Range Planning. 24 (2): 123. doi:10.1016/0024-6301(91)90132-8. ISSN 0024-6301.
  18. ^ Narayanan, V.K.; Armstrong, Deborah J., eds. (2005). Causal Mapping for Research in Information Technology. IGI Global. doi:10.4018/978-1-59140-396-8. ISBN 978-1-59140-396-8.
  19. ^ Powell, S; Remnant, F; Avard, R; Goddard, S (2021). "Guide to Causal Mapping".
