Designing a social capital dashboard: study

Study of approaches to presenting social capital data. The research is part of the process of developing the National Performance Framework.


Executive summary

Introduction

The Scottish Government is in the process of developing and improving its National Performance Framework. As part of this work it has identified a need for improved shared understanding of, and data on, social capital in Scotland.

Given the complexity of the concept of social capital, and the range of indicators that have been used to measure it in Scotland (and more widely) to date, a key consideration for the Government is how a range of data might be brought together within the new National Performance Framework to improve understanding of the levels of social capital in different settings and over time at national and local authority level. To inform these discussions, the Government commissioned Ipsos MORI to undertake scoping work to provide an account of the likely advantages and disadvantages of developing a social capital composite measure or dashboard, and to provide information about what the process might be for designing and implementing this.

The core of the research comprised a review of six existing comparable dashboards, followed by in-depth interviews with individuals who had been involved in commissioning/developing, maintaining and using these. The six dashboards were:

  • the Active Scotland Outcomes Framework
  • the European Commission’s Youth Monitor (The Situation of Young People in Europe)
  • the OECD’s ‘How’s Life?’ Dashboard
  • ONS’ Measures of National Wellbeing
  • Public Health England’s Public Health Dashboard (one of a large number of dashboards developed and maintained by PHE)
  • the ‘Understanding Glasgow’ website

This selection provided a mix of dashboards in terms of level of sophistication/design features; approaches to presenting data; levels of analysis possible; and progress monitoring.

Key findings

Defining the purpose of a dashboard

From the outset it is important that both commissioners and users of a dashboard have a clear, shared understanding of the intended purpose of the tool and how it will be used. The review pointed to ways of promoting such understanding, in particular through convening a steering group or stakeholder workshop (comprising, for example, relevant policy makers, delivery partners and experts in data gathering and use) to obtain views and feedback on the proposed dashboard, including the functions it might serve, possible uses, and potential issues or challenges. This was widely seen as helpful in promoting buy-in, as well as in reaching agreement on indicators for inclusion.

The review also pointed to ways in which commissioners can ensure that the purpose of a dashboard remains front and centre for users on a more consistent basis. While the most obvious of these was simply including within the dashboard a clear introductory statement describing what it was for, several of the dashboards reviewed went a stage further by providing links to strategies and initiatives aimed at influencing the indicators. One (Understanding Glasgow) went further still, incorporating case studies of asset-based approaches to improving people’s lives in the city, as well as a section on “how to use the data”.

Selecting indicators for inclusion in a dashboard

Deciding on the number and type of indicators to include in a dashboard is arguably the most important, but also potentially one of the most challenging, aspects of developing such a tool. Across the dashboards reviewed, two considerations appear to have been key in determining the selection of indicators. First, there was usually a focus on identifying indicators that aligned closely with the relevant “policy architecture”. Second, the commissioners had tended to select indicators for which data was already collected and available in the public domain. On a related point, several interviewees noted the importance of considering the likely lifespan of a data source – that is, whether the organisation that commissioned it was likely to have the resources to continue running the survey in the long term.

Just one of the dashboards reviewed – the Public Health England site – included composite indicators. These took the form of ‘summary rank indicators’ that compared performance in each area of delivery across local authorities. The rankings met with a mixed reaction on the part of local authorities, with some complaining that they decontextualised performance and were meaningless in practice. Some stakeholders favoured a focus on ‘headline’ indicators instead – that is, the identification of a single indicator for each theme covered in the dashboard to illustrate the status quo.

The ONS Measures of National Wellbeing Dashboard included a variation on a composite measure, showing the overall percentage of indicators in the dashboard that had improved, deteriorated or remained unchanged over the previous year. This approach has the obvious advantage of providing a snapshot of the current situation and direction of travel, without the challenges associated with combining multiple, disparate measures.
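As an illustration of that approach, the sketch below (in Python) computes the share of indicators that improved, deteriorated or remained unchanged between two periods. It is a minimal, hypothetical example: the indicator names, values and the simple change threshold are assumptions for illustration only, not details of the ONS methodology.

```python
from dataclasses import dataclass

@dataclass
class Indicator:
    name: str
    previous: float         # value in the previous period
    current: float          # value in the latest period
    higher_is_better: bool  # direction in which the indicator 'improves'

def summarise(indicators, threshold=0.0):
    """Return the percentage of indicators that improved, deteriorated
    or remained unchanged between the two periods."""
    counts = {"improved": 0, "deteriorated": 0, "no change": 0}
    for ind in indicators:
        change = ind.current - ind.previous
        if not ind.higher_is_better:
            change = -change
        if change > threshold:
            counts["improved"] += 1
        elif change < -threshold:
            counts["deteriorated"] += 1
        else:
            counts["no change"] += 1
    total = len(indicators)
    return {k: round(100 * v / total, 1) for k, v in counts.items()}

# Illustrative (invented) indicator values only
example = [
    Indicator("Volunteering rate (%)", 27.0, 28.5, True),
    Indicator("Feels lonely often (%)", 11.0, 11.0, False),
    Indicator("Trust in neighbours (%)", 74.0, 71.5, True),
]
print(summarise(example))  # {'improved': 33.3, 'deteriorated': 33.3, 'no change': 33.3}
```

The appeal of this kind of summary is that each indicator is only compared with its own previous value, so no weighting or standardisation across disparate measures is needed.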

None of the dashboards reviewed included any qualitative indicators. When interviewees were asked about this, they tended to say it was not something they had considered, as dashboards were seen primarily as a means of providing a robust snapshot.

Dashboard formats and features

Effective design helps ensure the content of a dashboard is accessible and clear to a wide audience. Any dashboard developed by the Scottish Government will of course need to meet the accessibility standards required of all government publications, which require that data be presented in a clear and unambiguous manner; the design will also need to go a step further, ensuring that what the data actually means is self-evident. One of the five commitments set out in the 2016 Scottish Open Government Partnership Action Plan specifically mentions: “making understandable information available through the National Performance Framework, which will be reviewed to reflect our commitments to Human Rights and the Sustainable Development Goals”.

At the same time, there is no single ‘correct’ way of approaching design. Indeed, ‘good’ design may be more accurately characterised as design that is ‘fit for purpose’. This reinforces the importance of clarity around the intended purpose of a dashboard, whilst also providing a case for the involvement of design professionals in discussions of purpose.

An initial practical consideration in terms of the format of a dashboard is where it should be hosted. For the most part, the dashboards reviewed were embedded within the commissioners’/developers’ websites; the exception being Understanding Glasgow, which comprised a dedicated microsite. Microsites allow greater freedom and control over content, though this must be balanced against the greater cost and technical resource required to develop and maintain them.

The dashboards reviewed contained both text and visual elements, with the focus primarily on the latter. Pages that required more description (for example, about the purpose of the dashboard) were kept separate from the presentation of the data, thus helping to maintain an overall ‘clean’ look. Charts were overwhelmingly the main type of visual used, though some of the dashboards contained infographic elements, videos and animations, and/or interactive maps to explore geographical variation in the data. The review identified three factors that needed to be borne in mind when considering such content options. Firstly, the design of the dashboard can only be as sophisticated as the IT systems underpinning it. Secondly, there should be a realistic assessment of the financial and technical resources required and available. Thirdly, it is important to remember that such resources will also be required on an ongoing basis to ensure the dashboard is functioning as it should.

Resourcing

All of the dashboards reviewed had been developed in-house, as this was more cost-effective than outsourcing. The main challenge of in-house development centred on the availability (or lack thereof) of staff with the requisite technical skills.

Interviewees who had been involved in developing dashboards agreed that the most resource-intensive phase was identifying and agreeing the intended audience and content for the dashboard. The agreement of content for Understanding Glasgow took a year, but there were also examples of shorter timescales (for example, four to six months).

It was not always possible to obtain similarly firm estimates of the resources required for subsequent development work (excluding technical development), but the review did ascertain that the European Commission Youth Monitor involved one person working intensively on it for one year; while the Public Health England Dashboard involved three or four members of the core team spending around half a day per week on it (the core project team included the head and deputy head of an analytical team, a methods expert, a principal analyst and an analyst), in addition to ongoing technical support from the internal development team (which supports all Public Health England’s dashboards).

In terms of technical development there was agreement that, with the range of software options now available (many of which are free of charge), this should be a relatively quick and inexpensive process – though it does require specific design/development expertise. Participants commonly recommended designing the dashboards to be as ‘future-proof’ as possible, to avoid having to make large-scale changes to them. This included ensuring they were device agnostic and using code that was modular and generic rather than hard-coded, as illustrated below.
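To illustrate what ‘modular rather than hard-coded’ content might look like in practice, the sketch below (in Python) describes indicators as data – a structure that could equally sit in a JSON or YAML configuration file – so that indicators can be added or retired without changing the rendering code. The indicator names, sources and fields are purely illustrative assumptions, not drawn from any of the dashboards reviewed.

```python
# Indicators are defined as configuration data rather than written into the
# display logic, so adding or retiring an indicator is a data change only.
# All names, sources and fields below are hypothetical.
INDICATORS = [
    {"id": "volunteering",
     "label": "Adults who volunteered in the last 12 months (%)",
     "source": "Illustrative national survey", "update": "annual"},
    {"id": "neighbourhood_trust",
     "label": "Adults who trust most people in their neighbourhood (%)",
     "source": "Illustrative national survey", "update": "annual"},
]

def render_tile(indicator, latest_value):
    """Generic rendering used for every indicator, whatever its subject."""
    return f"{indicator['label']}: {latest_value} (source: {indicator['source']})"

for ind in INDICATORS:
    print(render_tile(ind, latest_value="..."))
```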

In terms of ongoing development and maintenance, four factors appeared key in determining the resources required for this: how future-proof the dashboard was; how frequently the indicators would have to be updated; whether data needed to be externally sourced; and the extent to which updates were automated. Further, and beyond basic updates, several interviewees highlighted the ongoing resource required to continue to develop and promote the dashboards, and to respond to user enquiries.

Conclusions

Taken as a whole, the review findings provide a possible route map for developing a social capital dashboard for Scotland. This can be summarised in five stages, as follows:

Stage 1: Assess the feasibility and value
Convene a group of key stakeholders, potentially including policy makers, delivery partners, academics and data experts such as ONS, to consider the potential value of a social capital dashboard both in general and specifically as part of the NPF, and agree on the potential audience for the tool.

Decide whether to develop a social capital dashboard and the level of resource to dedicate to it. If a decision is made to continue, establish a steering group of key stakeholders to oversee this work and promote it.

Stage 2: Agree the intended audience and broad content
Develop an explicit statement on the purpose of a social capital dashboard and its intended users (and be clear about what it is not for). Agree the high-level definition of social capital and the concepts that the dashboard will measure. Develop criteria against which data quality and potential indicators can be assessed (e.g. the ability to provide robust, nationally representative data, at both aggregate and sub-group level, that can be updated regularly; the likely lifespan of the data source, etc.).

Stage 3: Technical development
Consider the existing administrative and survey data on social capital, and the extent to which this data is included elsewhere in the NPF and where it fits best in relation to agreed criteria (including NPF processes for the number of indicators to be included). Consider the design of the dashboard, how it looks, the extent of any data visualisation, linkages to the NPF and Scotland Performs website and explanatory text.

Consider the level of analysis in relation to geography and the extent to which data can be broken down by demographics. Consider how change and comparative data will be presented (e.g. will local authority data be presented in relation to Scotland as a whole, similar LAs or high-performing LAs?). Consider how the dashboard should be updated and maintained and how user feedback will be collected. Consider how the dashboard can be made as accessible as possible given the financial and technical resources required to maintain it.

Stage 4: Testing and pre-launch
Pilot the dashboard, collect user and stakeholder feedback, and refine it accordingly.
Plan communications for a ‘formal’ launch of the dashboard.

Stage 5: Dashboard goes live
Ongoing review and maintenance of the dashboard.

Contact

Email: Ben Cavanagh
