More Appropriate Information Systems and Services for the Social Scientist: Time to Put Our Findings to Work

R. Laval Hunsucker


A review of:
Line, Maurice B. “The Information Uses and Needs of Social Scientists: An Overview of INFROSS.” Aslib Proceedings 23.8 (1971): 412-34. Rpt. in Lines of Thought: Selected Papers. Ed. L.J. Anthony. London: Bingley, 1988. 45-66.

Objective – The study reported in this article was conceived to answer a question of very large scope: What are the information systems and services requirements of social scientists? Inherent in this question was the correlative question: How do social scientists tend to use such systems and services, and which resources and information access approaches do they prefer to employ? Such a broad approach was well considered, given that 1) almost no research results were available in this area at the time; 2) the investigators feared that approaches developed earlier for the natural sciences and technology would be uncritically adopted for the social sciences as well; and 3) “the social science information system was developing anyway, and if it was to develop in appropriate ways, some guidance had to be provided quickly” (412). The Investigation into Information Requirements of the Social Sciences (INFROSS) project team believed that there was “no point” (412) in embarking first on a series of more narrowly focused studies. The express intention was to derive findings that would be usable “for the improvement of information systems, or for the design of new ones” (414). For more on the project's conceptual underpinnings, see Line’s “Information Requirements.”

Design – Exploratory study employing both quantitative and qualitative approaches over a period of three and a half years, beginning in the autumn of 1967.

Setting – The whole of the United Kingdom. The project was funded by that country’s Office for Scientific and Technical Information (OSTI), which had been established in 1965.

Subjects – Almost 1,100 randomly selected academic social science researchers, plus a substantial number of government social science researchers and social science “practitioners” (“college of education lecturers, schoolteachers, and individuals in social work and welfare” [413]). For the purposes of the study, the social sciences included anthropology, economics, education, geography, political science, psychology and sociology, but numerous historians and statisticians ultimately participated.

Methods – Three methods were employed: surveys, interviews, and direct observation. A “very long” (413) questionnaire was sent to 2,602 of the identified ca. 9,100 social science researchers in the United Kingdom, with 1,089 (41.8%) completed questionnaires returned. Two pilots were conducted with the questionnaire before a definitive version was finalized for the study. Seventy-five interviews were conducted (individually or in groups) with researchers, some of whom had received but not responded to the questionnaire, and some of whom were not included in the questionnaire sample. The interviews with non-responding persons in the sample were for purposes of determining “whether they were non-typical” (413). Fifty additional interviews were conducted (individually or in groups) with practitioners. Day-to-day observation of a small number of social scientists was undertaken in the context of a two-and-a-half-year experimental information service at Bath University – the first time any UK university had employed information officers for the social sciences.

Main results – The results showed a pronounced perception among social scientists that informal “methods of locating references to relevant published information” (416-8, 426-7, 431) are more useful than formal methods (such as consulting the library catalogue, searching library shelves, or searching in indexing and abstracting publications), and an even more pronounced inclination actually to use such informal methods – something of a revelation at the time. Fewer than one sixth of all sociologists, for example, made use of Sociological Abstracts. On both counts, “consulting librarian” (418) scored worse than all ten of the other options: 48% of respondents never did so, and only 8% perceived it as a “very useful” (418) method. Nonetheless, 88% of respondents were in principle prepared to delegate at least some of their literature searching, and approximately 45% all of it, “to a hypothetical information officer” (425). More than 75% of the experimental service’s clients likewise answered affirmatively when asked whether, given limited available resources, a social science information officer should “be a high priority” (Line, Cunningham, and Evans 73-5). Most subjects found, in any case, that their major “information problems” (427-8) lay not in discovering what relevant documents might exist, but in actually getting their hands on them; in only around 20% of cases were they ultimately successful in doing so. The younger the researcher, the greater the dissatisfaction with her/his own institution’s collection. The study also revealed that academic social scientists drew little distinction between information needs for their research and those for their teaching.

There was one social science discipline which clearly stood out from the rest: psychology. Psychologists were the heaviest users of abstracting and indexing (A&I) publications, as well as of the journal literature, published conference proceedings, and research reports. They were also the least tolerant of time lags in the A&I services’ coverage of new publications.

Further significant findings were:
• A librarian’s way of categorizing research materials was not very meaningful to the researchers themselves.
• A&I services were generally used more often for ‘keeping up’ than for retrospective searching.
• Consultation with librarians was more common in the less scholarly and more intimate college environment than in the research institutions.
• A large percentage found library cataloguing insufficiently detailed. The same was true for book indexes.
• There was considerable enthusiasm for the idea of a citation index for the social sciences. (N.B.: the SSCI began publication two years after the appearance of this article.)
• Among informal methods of scholarly communication and information transfer, conferences (to the investigators’ surprise) rated remarkably low.
• Researchers with large personal collections made more use of the library and its services than those with small collections.
• Social scientists had little interest in non-English-language materials. Line speaks of “a serious foreign language problem” (424).

The INFROSS study produced an enormous amount of data. Of the computer tables generated, only 384 were made available, in four separate reports to OSTI, and only three tables, two of them abbreviated, appeared in this article. The remaining raw data were available on request.

Conclusion – Line himself was exceedingly cautious in drawing explicit positive conclusions from the INFROSS results. He even stated: “No major patterns were detected which could be of use for information system design purposes” (430). He was freer with his negative and provisional assessments. Two years earlier he had written: “It still remains to be established that there is an information problem in the social sciences, or that, if there is, it is of any magnitude” (“Information Requirements” 3). However, it was now clear to Line that information services and systems for the social scientist were indeed quite inadequate, and that (potential) users were not satisfied.

He was, furthermore, prepared to go out on a limb with the following assertions and inferences:
1) It was a great strength of INFROSS that it had – in marked contrast to previous science user studies – generated “a mass of comparable [his italics] data within a very broad field, so that every finding can be related to other findings” (430).
2) There are discernible – and exploitable – differences in the information needs and use patterns among the different social science disciplines (which he often also refers to as the different “subjects”).
3) INFROSS had likewise made more evident the nature of similarities across disciplines.
4) There is indeed, from an information/library perspective, a continuum from the ‘harder’ to the ‘softer’ social sciences.
5) Social scientists showed too little awareness of available information systems/services, made too little use of them, and even displayed “insufficient motivation” (431) to make use of them. He elsewhere (“Secondary Services” 269, 272) characterizes them as “remarkably complacent,” “even apathetic.”
6) There is good reason to doubt the wisdom of libraries’ investing in user education, since it is bound to have little effect (for further discussion of this matter, one can consult his “The Case for” 385-6 and “Ignoring the User” 86).
7) User-friendly systems inevitably amount to underdeveloped and ineffective systems – and “personal intermediaries,” in sufficient numbers, will therefore remain essential if we wish to offer social scientists really good information services (426, 431).

Line believed that INFROSS was only a beginning, and he had already, even before writing this article, begun follow-up research aimed at producing results of real use for information system design purposes (e.g., the DISISS project). He complained many years later, however, that all this research “indicated means of improvement, but led to no action” (“Social Science Information” 131). In any case, “Bath” (the common shorthand subsequently used to refer to all this research) became, and has remained, the starting point for all subsequent discussions of social science information problems. Several years ago, there was a well-argued international call for “a new and updated version of the INFROSS study” – with an eye to finally putting the findings to practical use, and aiming “to extend and follow up the research agenda set by the original study” (Janes “Time to Take”).

Evidence Based Library and Information Practice (EBLIP)