Diocesan Missal Survey report goes live

From 1 January to 30 April 2012 the Diocese of Portsmouth conducted a survey on the new texts. Replies were received during this period in written form and as email attachments. The survey form was posted on the diocesan website, and was accessed by people from outside the diocese as well as from within it.

Processing of the responses, originally intended to be completed by the end of July, was delayed by pressure of other work connected with the arrival of a new Bishop in the diocese in September.

The complete report has been submitted to the Bishops’ Conference of England and Wales via its Liturgy Office, and to ICEL, and is now available to download as a PDF from the link on the following page:
http://www.portsmouthdiocese.org.uk/liturgy

21 comments

  1. Positive and very positive responses total 29.32%, leaving over 70% not positive.
    Specifically negative and very negative: 49.19%, almost half the respondents.
    It seems fair to conclude from this that a majority of the Catholic population is unhappy with their prayers and responses.

    I suppose if “unhappy” means “not happy” (i.e. including “neutral”) this would be true. But that is not, I think, the usual meaning of “unhappy.”

  2. Concerning the language of the people’s responses and prayers, a panoply of adjectives and descriptors that would be the envy of Roget’s Thesaurus is wheeled into line:

    alien, antiquated, arcane, archaic, argumentative, artificial, awful, awkward, bad English, bland, clumsy, clunky, complex, just too complicated, confusing, contrived, convoluted, cumbersome, difficult, disjointed, vastly disappointing, distancing, does not flow, doesn’t make sense, does not scan, dreadful, impenetrable, inaccessible, inappropriate, incomprehensible, inconsistent, incorrect, lacking in lightness, lacking in simplicity, Latinate, Latinized, less meaningful, less spontaneous, literal, long, non-descriptive, not dynamic, not current English, seems not designed for oral recitation, not easy to say or sing, not gender-inclusive, not smooth, not user-friendly, not vernacular English, (wilfully) obscure, obscurantist, very odd, too old-fashioned, opaque, over-the-top, pedantic, plodding, pompous, poor, rather poor, very poor, “sounds like a translation”, stilted, rather stilted, (very) strange, tortuous, trivialised, ugly, ungrammatical, unmoving, unnatural, unnecessary, unpoetic, unreal, unrhythmical, unwelcome, unwelcoming, verbose, Victorian, wooden, wordy…

    I love this list.

    Contrary to what many people believe (including some social scientists), open-ended responses have been shown to be far more reliable and valid than rating scales. Open-ended responses are not often used because organizing them in some fashion (i.e. coding them) takes time, and the coding system may be arbitrary. In this case the alphabetical list really accomplished the task well.
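    As a minimal sketch of what such coding involves (using made-up response codes, not the survey's actual replies), once open-ended answers have been reduced to code words, tallying and listing them alphabetically is the easy part:

    ```python
    from collections import Counter

    # Hypothetical coded responses; in practice, assigning these codes
    # is the slow, judgment-laden step the comment above describes.
    coded_responses = ["clunky", "archaic", "clunky", "wordy",
                       "archaic", "clunky", "beautiful"]

    tally = Counter(coded_responses)

    # Alphabetical listing, as in the report's adjective list
    for code in sorted(tally):
        print(f"{code}: {tally[code]}")
    ```
    
    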

  3. Here’s the big question:
    How come the US Catholic survey was so negative, CARA was mixed but rather positive, and this report is so negative?
    CARA is most scientific; USC isn’t scientific at all; Portsmouth isn’t totally scientific but seems to be quite representative.
    What’s going on?
    awr

    1. @Anthony Ruff, OSB – comment #3:
      I am curious what you mean by ‘representative’, Fr. Anthony, and what makes the Portsmouth survey representative. (That’s an honest curiosity question, not a subtle jab at the validity of the Portsmouth results).

      My (unsupported, completely personal) hypothesis is the following:

      — People who are extremely unhappy with the new translation, and who are sufficiently well-versed about or interested in liturgical matters/the translation controversy/etc. (as evidenced by responses citing Jansenism, etc.) will look for an outlet to voice their frustration. Unlike Portsmouth, many dioceses do not provide this, but others like the USC, the Tablet, etc. do, and people go to them.

      — People who aren’t sufficiently interested in liturgical matters may be happy or unhappy or couldn’t care less; either way, they are the least likely to respond when information for a poll is sought voluntarily. I think this describes the majority of people in the pews.

      — People who support the new translation often (though not always) come from a particular ideological camp, and are currently satisfied, or cock-a-hoop that they’ve ‘won’, and have less reason to express their feelings in any polls, unless they feel it their bounden duty to mitigate the reaction of the negatives.

      1. @Joshua Vas – comment #7:
        Joshua,
        You ask a fair question. I have the impression – it’s nothing more than that – that the diocese made an effort to give everyone a chance to respond, and there wasn’t an effort from the outside (e.g. from Fr. Z or from PTB) to bombard the site with responses from one sector to skew it, and they say that most of the responses were from within the diocese. So that’s why I think it’s somewhat representative. But maybe it really isn’t, scientifically speaking.
        Your hypotheses sound pretty right to me.
        awr

    2. @Anthony Ruff, OSB – comment #3:

      The short answer is pretty much the same for differences in all surveys: 1) differences in the wording and format of the questions, and therefore in how they are interpreted by the respondents, researchers, and readers; and 2) differences in the sample, and therefore in the hypothetical population to which we might generalize the results.

      Let’s take the questions first, since their reliability and validity are key to the success of the survey. Reliability is high when you get a similar answer from the respondent over time, or when asking the question in a different way – an answer not influenced much by how respondents interpret the question, their mood, etc. Open-ended responses are generally more reliable, since respondents don’t have to figure out how to fit themselves into the researcher’s categories.

      Respondents attempt to figure out what the question is really about. Slight variations can result in major changes: e.g. questions asked by a Catholic diocese or research organization can elicit very different answers than questions asked by the Gallup poll, or even a Gallup survey about religion in general versus a Gallup survey about Catholicism. People make guesses about what the survey is about and respond accordingly.

      That gets to the whole issue of what a particular question is really measuring (validity). Is it tapping into people’s general tendency to give someone or something the benefit of the doubt, to make themselves look good, to be religious, to be a good parish member, etc.?

      Questions become more valid to the degree they reflect attitudes toward the New Missal in particular (as opposed to the old Missal), rather than toward the Mass in general (whether people like the homily and the hymns), or the parish in general, or Catholicism in general.

      So far the survey questions have not allowed us to compare satisfaction between the Old and New Missals, or with homilies, hymn choices, congregational singing, choral singing, cantors, prayers, EPs, their parish, the church, the bishops, etc.

      We need a Vibrant Parish Liturgy Study to determine both the importance of a whole list of things and how well they are done, and then see where the new Missal fits.

    3. @Anthony Ruff, OSB – comment #3:

      2. differences in the sample and therefore the hypothetical population to which we might generalize the results

      My favorite study, the Vibrant Parish Life Study here in the Diocese of Cleveland, with more than 46,000 respondents in 126 participating parishes, was not a random-sample study. I think the 126 parishes were self-selected, not randomly sampled. In my local parish about 250 respondents were selected randomly, but more than 750 respondents filled out the survey during homily time at Mass. Obviously the sample is biased: it over-represents parishes that are more involved in diocesan planning processes and people who are more involved in their parishes, and the results might be peculiar to the Cleveland diocese. However, with so many parishes and so many respondents, I have a lot more faith in it than in many “more scientific” studies. My faith has been helped by the fact that my local parish results were very similar to the overall results, even though I would not have predicted that. I spent four years on pastoral council viewing the parish through the lens of that study.

      Same Call, Different Men had a response rate of 30%, which is considerably less than the 68% to 89% response rates in prior studies, which used a two-stage process that sampled dioceses and religious orders, then priests within them. Those studies had the advantage that bishops and superiors encouraged participation. So people could argue that 70% of the priests were so demoralized (e.g. by the sex abuse scandal) that they did not even respond. However, the pattern of responses this time is so similar to the previous studies that this is highly unlikely.

      Vast numbers of psychological studies are done on college students. Narrowly speaking, their results should not be generalized to all adults, but practically speaking they are – unless some researcher has a good theory as to why the results are limited to college students, and demonstrates that in subsequent research.

      There is a lot more face validity to the US Catholic and Portsmouth studies than to the CARA study, because of the extensive open-ended comments. They surely represent the opinions of some people. The question is who these people are, and how many other people have similar opinions.

    4. @Anthony Ruff, OSB – comment #3:
      “What’s going on?”

      Actually, such a disparity is not at all unusual – between the results of a survey with an inadequate number of non-randomly selected respondents, and the results of a “scientific” survey with a sufficient number of randomly selected respondents to make reliable representation of the larger population statistically probable.

  4. I think that many times unscientific polls attract people with a keen interest, one way or another, in the given subject, people with an ax to grind.

    My Ax: Love the TLM and am generally favorable to subordinate clauses, even clunky ones, over the 1970 translation. So, if I hear of a survey I will seek it out if I have the time. Add to that, at least in the Anglophone world, people like to voice complaints to some official something or other.

  5. Christopher Douglas : I think that many times unscientific polls attract people with a keen interest, one way or another, in the given subject, people with an ax to grind. My Ax: Love the TLM and am generally favorable to subordinate clauses, even clunky ones, over the 1970 translation. So, if I hear of a survey I will seek it out if I have the time. Add to that, at least in the Anglophone world, people like to voice complaints to some official something or other.

    Exactly. As someone with a background in Economics (and therefore Statistics/Econometrics) I point out that you *always* have to take into account sampling bias; some people want to be interviewed and others don’t. People who feel “Meh.” don’t have anything to say either way.

  6. Adding the “Negative” and “Very Negative” responses for the appropriate survey items:

    2a. 49% negative
    2b. 43% negative
    3. 42% negative

    Thus, a minority of the respondents selected negative responses. However, by accepted statistical standards, the sample size of 307 self-selected respondents is not adequate to justify any inference as to whether the larger population is negative or positive regarding these items.
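    As context for the sample-size point, here is a rough sketch (my own illustration, not from the report) of the textbook margin-of-error formula for a proportion from a simple random sample. It shows that n = 307 would actually give a margin of only about ±5.6 points at 95% confidence *if* the sample were random; what the formula cannot capture is self-selection bias, which is the real obstacle to inference here.

    ```python
    import math

    def margin_of_error(p, n, z=1.96):
        """Half-width of the 95% confidence interval for a proportion p
        estimated from a simple random sample of size n."""
        return z * math.sqrt(p * (1 - p) / n)

    # 49% negative on item 2a, from 307 respondents
    moe = margin_of_error(0.49, 307)
    print(f"+/- {moe:.1%}")  # roughly +/- 5.6 percentage points
    ```

    The caveat bears repeating: this interval is only meaningful for a random sample, so it quantifies the best case, not the survey as conducted.
    
    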

  7. Lots of things affect what a survey says. What was asked, and how? To take an example: do you support a woman’s “right to choose” or her “right to take the life of an unborn baby”? You get very different responses from the same people depending on how you ask.

    Then there’s what you ask. Are you looking for affective responses or cognitive ones?

    Et alia. Each survey stands on its own two feet.

  8. I would say that 2/3 of the responses came from within the diocese, and of that 2/3 a good 3/4 were from “ordinary” parishioners in the pews. It was also extremely interesting how on occasion people from the same parish could hold diametrically opposite views.

  9. Taking these surveys seems a little like the desire for democracy in the Middle East: too often, the results are not to our liking… so we take another, really “scientific” poll, and it, too, gives us results we didn’t want. Well, obviously something (this, that, and the other) was wrong with it (or the people), so we need to take yet another one… an even more scientific one! Hmmm: maybe we will like ITS results; but if not, then we’ll just have to take another one. (Why, we might even have to enact some emergency laws effective until we get the result we want.)

    (After all: a teenager wore a T-shirt that said ‘just say no to the new translation’, and his bishop thought it was cute. What can one say in the face of such profound and learned reflection, such wisdom, such pure intelligence?)

  10. Would it be too cynical of me to wonder why the rather low number of responses to the survey was tucked away towards the end after all the lovely pie charts?

    From Mr Inwood’s estimated breakdown (#11), within Portsmouth diocese itself, for lay faithful responses we’re looking at 3/4 of 2/3 of 307, i.e. about 155. Not really very many for a diocese with an estimated Catholic population of 163,076 (2011 figure). Nor is around 206 (2/3 of 307) that many responses from within the diocese in the first place. (And why were responses from outside the diocese taken into consideration at all?)

    I just don’t see how the responses of 206 people can be taken as in any way representative of the feelings in the diocese as a whole. Perhaps if the field work underlying the survey had been better, the survey might have been useful – as it is, it just seems to be a near-total waste of diocesan resources.
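    Taking the fractions quoted above at face value, the arithmetic can be checked in a few lines (the exact values come out slightly below the rounded figures in the comment):

    ```python
    total_responses = 307
    in_diocese = total_responses * 2 / 3      # about 205 responses from within the diocese
    lay_in_diocese = in_diocese * 3 / 4       # about 154 lay responses

    catholic_population = 163_076             # 2011 diocesan estimate cited above
    share = lay_in_diocese / catholic_population

    print(f"{lay_in_diocese:.0f} lay responses = {share:.2%} of the diocesan Catholic population")
    ```

    In other words, the lay responses from within the diocese amount to roughly a tenth of one percent of its estimated Catholic population.
    
    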

    1. @Matthew Hazell – comment #15:
      It’s not representative in the sense most people would use that term, and the survey is useless in that regard. It is useful as a gauge of what motivated people might say; and, if one understands that people are usually more motivated to express displeasure than pleasure in an anonymous context, one can see it in that light. Jack R is correct that the multiple-choice questions are less salient (as a general matter) than the open-ended questions.

      One thing we are politely ignoring, on both sides, is that there were folks pre-implementation who predicted paradise or disaster. How have such folks avoided rationalizing what has actually happened in light of their predictions?

      1. @Karl Liam Saur – comment #16: It’s not representative in the sense most people would use that term, and the survey is useless in that regard.

        Well, quite. Though I predict that won’t stop the usual suspects claiming it as evidence that “the people” don’t accept the new translation. One only need read the last page of the published results of the survey to see how its architects wish it to be read.

        Still, considering the accuracy of the predictions of post-implementation revolution/enlightenment – which anyone with a modicum of common sense largely ignored – my own prediction will have to be taken with a pinch of salt. 🙂

        It is useful as a gauge of what motivated people might say…

        I’m not sure how useful that gauge is, regardless of whether the motivation is positive or negative (and, as you rightly point out, the motivation is more likely to be negative). It seems very echo-chamber-esque to me.

    2. @Matthew Hazell – comment #15:
      I may be unrepresentative, but I live in the Portsmouth diocese and my parish wasn’t informed about the survey. I do follow quite a few online Catholic sites (including this one, but not my diocesan site regularly) but was unaware of this survey until yesterday.

      1. @CPK Smithies – comment #19:

        A number of respondents in the diocese said that they were only informed about the survey on April 29, the day before it closed…. 🙁 It had been on the website for months and the clergy had all been informed in an Ad Clerum letter.
