Martin & Rose (2007: 331-2):
From our own vantage point, there have been some interesting shifts of focus in discourse analysis over the four decades of Jim’s involvement and two decades of David’s. In the 1970s, cohesion was the favoured episteme, as grammarians cast their gaze outwards beyond the clause. In the 1980s it was genre that came to the fore, fostered in important respects by work on literacy development in the Sydney School, English for Academic Purposes and New Rhetoric traditions (Hyon 1996). The 1990s saw the emergence of evaluation as a major theme, as analysts developed models of attitude in functional and corpus linguistics (Hunston and Thompson 2000, Martin and Macken-Horarik 2003). Currently we are in the midst of a surge of interest in multimodal discourse analysis, inspired by the ground-breaking work of Kress and van Leeuwen (1996/2006, 2001) on images. Looking ahead, we can probably expect an emerging rapprochement between qualitative and quantitative approaches to text analysis, depending on the kinds of technology that can be brought to bear in large-scale studies of many and longer texts. Just how this will tend to focus discourse analysis epistemes is harder to predict. Our own approach, in this book and beyond, contrasts strikingly with current trends, which for operational reasons (or worse) tend to elide discourse semantics in favour of word counts, collocations and colligations — as if texts were random sequences of words, phrases or clauses. As analysis technologies develop, we need to ensure these trends do not become entrenched in the field in the long term.
Blogger Comments:
[1] To be clear, in SFL Theory, discourse analysis is the use of linguistic theory (potential) to analyse texts (instances of language).
[2] This is misleading. Four decades before the first edition of this book, Jim Martin was just 13 years old. David Rose, on the other hand, had completed his PhD (a description of an Australian language) only five years before the first edition.
[3] This is misleading on the one hand, and misunderstands the term 'episteme' on the other. Firstly, it is misleading because it falsely claims that discourse analysis was restricted to the analysis of cohesion, which is merely the non-structural component of the textual metafunction. Discourse analysis potentially involves the whole of the theory, but crucially, it deploys the grammar. As Halliday (1985: xvii) made clear:
A discourse analysis that is not based on grammar is not an analysis at all, but simply a running commentary on a text… the exercise remains a private one in which one explanation is as good or as bad as another.
Secondly, the use of 'episteme' here is inconsistent with its use in philosophy, whether by Plato or by Foucault, the latter being the more likely source for Martin & Rose. For Foucault, 'épistémè' refers to the epistemological assumptions on which meaning-making is based, as exemplified by the SFL Theory assumption that meaning is immanent in semiotic systems, not transcendent of them.
More generally, 'episteme' means a principled system of understanding, such as a science, and contrasts with 'techne', an applied practice. In discussing the practice of discourse analysis, then, Martin & Rose are concerned with techne, not episteme. Plato, for his part, distinguishes 'episteme' from 'doxa': common belief or opinion. In tracing a history of "favoured epistemes", Martin & Rose are actually concerned with doxa (with regard to techne).
[4] This is misleading. Here Martin & Rose present Martin's personal trajectory of interests as if it constituted the prevailing trends in the worldwide SFL community as a whole.
[5] This is misleading, because here Martin & Rose fallaciously argue for the value of their own approach, discourse semantics, by contrasting it with a straw man: non-existent Systemic Functional linguists who analyse texts as if they were "random sequences of words, phrases or clauses".
Firstly, as this blog and the review of Martin (1992) have demonstrated, Martin's model of discourse semantics consists of his misunderstandings of Halliday's speech function (interpersonal semantics) and Halliday & Hasan's cohesion (non-structural textual lexicogrammar), rebranded as his own systems and thereby presented as his own theoretical insights.
Secondly, in SFL Theory, the quantitative approach is used to distinguish texts according to register. Texts differ in the frequencies with which features are selected, across all the systems of the content plane, and these instance frequencies instantiate the probabilities of feature selection that distinguish one register from another. This is, of course, all lost on those, such as Martin & Rose, who cannot distinguish registers of language from the cultural context of language.
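To make this concrete, here is a minimal sketch of what such frequency profiling involves, assuming clause-by-clause MOOD analyses of two texts. The feature labels and the two toy 'texts' below are invented purely for illustration; they are not drawn from any actual corpus or annotation tool.

```python
from collections import Counter

# Hypothetical clause-by-clause MOOD selections for two short texts.
# In practice these would come from grammatical analysis of each clause;
# the labels and data here are invented purely for illustration.
casual_conversation = [
    "interrogative", "declarative", "interrogative", "imperative",
    "declarative", "interrogative", "declarative", "imperative",
]
written_report = [
    "declarative", "declarative", "declarative", "declarative",
    "declarative", "interrogative", "declarative", "declarative",
]

def selection_frequencies(selections):
    """Relative frequency of each feature selection in one text.

    On the SFL view sketched above, these instance frequencies are taken
    to instantiate the probabilities of selection in the text's register.
    """
    counts = Counter(selections)
    total = len(selections)
    return {feature: count / total for feature, count in counts.items()}

for name, text in [("casual conversation", casual_conversation),
                   ("written report", written_report)]:
    freqs = selection_frequencies(text)
    summary = ", ".join(f"{feat}: {freq:.2f}" for feat, freq in sorted(freqs.items()))
    print(f"{name}: {summary}")
```

The point of the sketch is simply that, on this view, registers are compared through the probabilities of feature selection across the systems of the content plane, as instantiated by frequencies in texts, not through raw word counts.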