During the social indicator trend of the 1970s, specifically in the US government, ‘Choices of indicators were not selected explicitly to address defined policy objectives, nor were they linked to policy proposals in areas where there was a public commitment to action. Indicators were not designed to characterize issues narrowly or to evaluate policies. The choices of what indicators to include may have represented results of negotiation among agency and social report staff but the task had not included public and Congressional debate. Thus, the implications of the indicators were neither salient nor known to these audiences. Moreover, there was no public obligation to examine the data during policy debate in a way that might have forced their linkage to issues.’
This is a quote from Judith Eleanor Innes’s Knowledge and Public Policy: The Search for Meaningful Indicators, and it has helped me think through the idea of altmetrics for public services. In academia, altmetrics is the practice of looking at the social and cultural impact of research publications by gathering data about citations from around the web. It is an extension of measuring the impact of research by the number of citations in research publications, but we can reimagine altmetrics in public services as citations made in different ways. Maybe something along the lines of: a citation is a service accessed that made a positive impact on someone’s life, or a citation is the sentiment of web and social media content created by citizens and stakeholders. See what I mean? Stick with me. I acknowledge these are not fully formed ideas, but I think they have teeth and there are miles of space for discussion.
Some current national indicators, like ‘Widen use of the Internet’, are measured against the results of just one piece of research: the Scottish Household Survey (SHS). Of the 17,000 people asked to take part in the household survey, nearly 6,000 don’t follow through. The people approached are selected as randomly as possible, to give everyone an opportunity to be included. The responses that are collected, however, must be representative, so weighting is applied to the responses from profiles of people who tend not to take up the offer to take part in the research. For example, 60% of the interview invitations are taken up by older women, with men aged 16-25 hardly taking part, so the responses of men aged 16-25 are weighted as 1.2 of a response. The weighting of some answers may be considered representative in a traditional analytical community, but how can government look at altmetrics for an indicator like ‘Widen use of the internet’? It is ripe for near real-time subjective data collection through agencies working largely with the populations who don’t take part in the SHS. What creative alternative ways could other indicator owners tap into technology and digital tools that collect all kinds of data, to complement the single sources of information by which their indicators are measured?
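To make the weighting idea concrete, here is a minimal sketch of post-stratification weighting, the kind of adjustment described above. All the shares below are invented for illustration; they are not the actual SHS figures or its published methodology.

```python
def post_stratification_weights(population_share, sample_share):
    """Weight each group so the weighted sample matches the population.

    A group that is under-represented among respondents (sample share
    below population share) gets a weight above 1, like the men aged
    16-25 mentioned above.
    """
    return {group: population_share[group] / sample_share[group]
            for group in population_share}

# Hypothetical shares of the population vs. shares of actual respondents.
population = {"men 16-25": 0.06, "older women": 0.20, "everyone else": 0.74}
respondents = {"men 16-25": 0.05, "older women": 0.30, "everyone else": 0.65}

weights = post_stratification_weights(population, respondents)
# Under-represented men 16-25 get a weight of 0.06 / 0.05 = 1.2, so each
# of their responses counts as 1.2 of a response; over-represented older
# women get a weight below 1.
```

This is exactly why a single weighted survey can look representative on paper while still resting on very few actual voices from some groups.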
Here are two ways we might look at altmetrics for Scottish public services:
- Social listening: I’ve been talking about this for some time. Social listening and online community management are pretty standard in the commercial sector. Could the same kind of activity be carried out in public services to inform how well they are performing? It might not be so straightforward right now, as social listening in government bodies is a grey area. Is it actually spying when government does it? What can we do to make the rules and expectations around public service social listening clearer? Would data and information from social listening make good altmetrics for public services?
- Detailed data from public services: The creation and development of SCVO’s Good HQ could provide some incredible possibilities for using data to create altmetrics for national and local outcomes or indicators. The surge of open data activity in Scotland will also give us access to new information and creative ways to mash it up.
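As a toy illustration of the social listening idea above, here is how mentions of a public service might be turned into a simple altmetric: a net sentiment score. The keyword lists and posts are invented for illustration; a real system would use a proper sentiment model and real web and social media data.

```python
# Hypothetical keyword lists; a production system would not rely on these.
POSITIVE = {"helpful", "easy", "great", "quick"}
NEGATIVE = {"confusing", "slow", "broken", "unhelpful"}

def net_sentiment(posts):
    """Score each post +1, -1, or 0 by keyword match; return the mean."""
    scores = []
    for post in posts:
        words = set(post.lower().split())
        if words & POSITIVE and not words & NEGATIVE:
            scores.append(1)
        elif words & NEGATIVE and not words & POSITIVE:
            scores.append(-1)
        else:
            scores.append(0)
    return sum(scores) / len(scores)

# Invented example posts about a service.
posts = [
    "Renewing my bus pass online was quick and easy",
    "The new portal is confusing and slow",
    "Applied for support today",
]
score = net_sentiment(posts)  # one positive, one negative, one neutral
```

Tracked over time, a score like this could sit alongside a traditional indicator as a near real-time signal rather than replace it.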
Though the social indicator movement has been seen by some as a failure, Innes says the movement left a legacy: it helped clarify that ‘more is required to inform policy than simply producing academically certified data and handing it to policy makers.’ Traditionally, at least at central government level, analysts have created a selection of indicators and outcomes based on things they know can be measured. As we go into a period of re-evaluating the Scottish National Outcomes, tied up with the Community Empowerment (Scotland) Bill, let’s start thinking about the altmetrics that could complement indicators and outcomes based on existing statistical research methods and topics. Let’s be creative and think about what we can bring into the mix: something community-based and dynamic, as close to real time as data can get, and usable alongside traditional consultation as set out in the National Outcomes section of the Community Empowerment Bill.