Published on 14.12.2023 in Vol 7 (2023)

Preprints (earlier versions) of this paper are available at https://preprints.jmir.org/preprint/44382.
Web-Based Public Reporting as a Decision-Making Tool for Consumers of Long-Term Care in the United States and the United Kingdom: Systematic Analysis of Report Cards


Original Paper

1Chair of Health Care Management, Institute of Management, Friedrich-Alexander-Universität Erlangen-Nürnberg, Nürnberg, Germany

2School of Public Health, Universität Bielefeld, Bielefeld, Germany

Corresponding Author:

Kristina Kast, MSc

Chair of Health Care Management

Institute of Management

Friedrich-Alexander-Universität Erlangen-Nürnberg

Lange Gasse 20

Nürnberg, 90403

Germany

Phone: 49 911530296393

Fax: 49 911530295285

Email: kristina.kast@fau.de


Background: Report cards can help consumers make an informed decision when searching for a long-term care facility.

Objective: This study aims to examine the current state of web-based public reporting on long-term care facilities in the United States and the United Kingdom.

Methods: We conducted an internet search for report cards that allowed a nationwide search for long-term care facilities and provided freely accessible quality information. On the included report cards, we drew a sample of 1320 facility profiles by searching for long-term care facilities in 4 US and 2 UK cities. Based on those profiles, we descriptively analyzed the information provided by the included report cards.

Results: We found 40 report cards (26 in the United States and 14 in the United Kingdom). In total, 11 of them did not state the source of information. Additionally, 7 report cards had an advanced search field, 24 provided simplification tools, and only 3 had a comparison function. Structural quality information was always provided, followed by consumer feedback on 27 websites, process quality on 15 websites, prices on 12 websites, and outcome quality on 8 websites. Inspection results were always displayed as composite measures.

Conclusions: The identified report cards show clear deficits. To make them more helpful for users and to bring public reporting a bit closer to its goal of improving the quality of health care services, both countries are advised to concentrate on optimizing the existing report cards. These should become more transparent and improve the reporting of prices and consumer feedback. Advanced search, simplification tools, and comparison functions should be integrated more widely.

JMIR Form Res 2023;7:e44382

doi:10.2196/44382




Introduction

An increasing number of people worldwide have functional or cognitive impairments and are dependent on others to manage their everyday lives [1]. This often requires moving to a long-term care facility. However, the search for a suitable facility is often complex and calls for an assessment of information about the quality of care. Report cards with freely available information on facilities’ performance can help consumers make an informed decision.

The method of making health care quality information publicly available and transparent to consumers is called public reporting [2]. The aim is to improve the quality of health care services [3]. Public reporting is expected to work in 2 ways to achieve this goal. First, it shows providers how they perform compared with other providers and gives them the opportunity to undertake measures for performance improvements (change pathway). Second, public reporting enables consumers to compare facilities and distinguish well-performing from poor-performing providers, so they can choose the best one (selection pathway) [4-6]. As a consequence, this not only enables consumers to make an informed decision but also activates providers (motivation) to actually undertake improvement measures [4].

From an international perspective, the United States and the United Kingdom are 2 countries with a long tradition of public reporting [7]. In the United States, public reporting was initiated in the hospital sector in 1754 [8]. In 1984, the Institute of Medicine (today the National Academy of Medicine) drew attention to the poor quality of nursing homes [9], and public reporting was subsequently adopted within the long-term care sector. Since 1998, it has been mandatory for long-term care facilities to regularly submit selected facility and resident data to a national database (Minimum Data Set), from which quality indicators (eg, ulcer prevalence, restraint use, and mobility improvement) are analyzed by the Centers for Medicare & Medicaid Services (CMS) [9,10]. Since 2002, this information, together with the results of inspections performed by the CMS, has been published on the national report card Nursing Home Compare (Medicare.gov) [10,11].

In the United Kingdom, the reporting of hospital performance information dates back to 1860, when the first systematic reporting of hospital mortality rates began [8]. In the UK long-term care sector today, facilities are encouraged to monitor and report their own performance data to make quality transparent and to improve the quality of their services. It is mandatory for facilities to undergo annual quality inspections performed by authorized health care agencies. In England, the Care Quality Commission (CQC), founded in 2008, is the agency regulating and inspecting health and social care [12]. The CQC assesses facilities in terms of patient safety, effectiveness, and other aspects [13]. The results of these inspections are published on the CQC website [14] and on the report card NHS Choices [15]. In Scotland, the Care Inspectorate, established in 2011, is responsible for the registration and inspection of long-term care facilities [12,16]. It assesses quality on a 6-grade scale (from “unsatisfactory” to “excellent”), considering aspects such as staffing or management quality [17,18]. The inspection results are available on the Care Inspectorate website [19].

Many new report cards providing information on long-term care facilities have since been created [2]. By analyzing those report cards, this study contributes to the body of literature by revealing the current situation of web-based public reporting as a decision-making tool for consumers and by indicating further development potential for the countries themselves. It can also serve as a basis for a wide variety of further studies that assess selection pathway measures in long-term care settings. For countries whose report cards were inspired by the US and UK examples, it can indicate to what extent they should reevaluate their own public reporting.

Regarding academic discussions on public reporting, several international studies focused on long-term care and included the United States, the United Kingdom, or both in their comparisons. Some of them studied public reporting in general. For example, they compared the effectiveness [20] or validity [21] of quality indicators used in the long-term care sector of those countries, studied what quality information people prefer when choosing a long-term care facility [22], and gave implications for more effective public reporting [23].

There are also some international studies with a focus on web-based public reporting. For example, du Moulin and colleagues [24] compared the official websites of the quality initiatives of 14 countries and found that the quality indicators on them varied in type and number across the countries, while the outcome indicators received little attention. Similarly, Rodrigues and colleagues [3] provided a comparative overview of quality indicators published on selected websites of 7 countries and appealed for better design of report cards. Rechel and colleagues [25] surveyed key national informants of 11 countries about quality information the countries published on report cards. The study had, however, a strong focus on hospitals. On long-term care, the authors reported that only a few countries displayed quality information as an overall rating. Damman and colleagues [26] conducted an internet search on 10 countries and found 42 report cards. The authors reported that many of them offered no quality information at all. This study, however, also covered different health care sectors.

While there are several international comparative studies on long-term care public reporting in different countries, including the United States and the United Kingdom, they either did not focus on report cards [20-23], addressed only selected report cards [3,24], or combined report cards of different health care sectors [25,26]. With this study, we aimed to analyze the current state of public reporting in the United States and the United Kingdom, with a focus on report cards for the search of long-term care facilities. Therefore, we addressed the following questions: (1) What report cards are available for consumers in the United States and the United Kingdom when using the internet to choose a long-term care facility? and (2) What types of quality information do these report cards provide and how?


Methods

Search Strategy

We decided on the United States and the United Kingdom due to their long experience with public reporting and because of the authors’ familiarity with the language. Similarly designed studies from other settings and countries served as the structural basis of this study. We systematically searched for US and UK websites that provide information on the performance of long-term care facilities to the public. The search was conducted in mid-2020 and updated in mid-2021. Following previous research [27], we used the 2 most popular search engines: Google and Bing had the highest market shares based on page views referred by a search engine in both countries at that time [28]. To avoid bias in accessing the search engines due to the location of the researchers in Germany, we used internet access through the virtual private network CyberGhost [29] to simulate the search from the United States and the United Kingdom. In addition, to avoid potential influence from past search behavior, we used a fresh browser installation for the search. Before drawing a sample of websites, we constructed a search strategy following previous research [27]. We used terms such as “compare nursing homes,” “nursing home rating,” “find care homes,” and “choose the right care home” (see Table S1 in Multimedia Appendix 1 for more details), following previous literature [30-32].
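To make the sampling frame concrete, the following minimal sketch enumerates the query combinations of country, search engine, and search term from which the first 50 organic hits were recorded. The term lists and their assignment to countries are abbreviated, illustrative assumptions; the full lists are given in Table S1 in Multimedia Appendix 1.

```python
from itertools import product

# Illustrative, abbreviated term lists; the assignment of terms to countries is
# an assumption for this sketch (full lists: Table S1, Multimedia Appendix 1).
SEARCH_TERMS = {
    "US": ["compare nursing homes", "nursing home rating"],
    "UK": ["find care homes", "choose the right care home"],
}
SEARCH_ENGINES = ["Google", "Bing"]
HITS_PER_QUERY = 50  # only the first 50 organic hits per query were considered

# Every (country, engine, term) combination defines one query in the sampling frame.
queries = [
    (country, engine, term)
    for country, terms in SEARCH_TERMS.items()
    for engine, term in product(SEARCH_ENGINES, terms)
]

for country, engine, term in queries:
    print(f"{country} | {engine} | '{term}' -> record first {HITS_PER_QUERY} organic hits")

# With the full term lists, the recorded hits add up to the drawn sample of
# 1500 websites (600 for the United States and 900 for the United Kingdom; see Table 1).
print(f"Queries in this abbreviated example: {len(queries)}")
```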

Sampling of Websites

Search engines typically display 10 hits per search query by default. The upper organic hits (ie, without ads) are usually the most noticed. After that, the likelihood of being noticed decreases [33]. Furthermore, over 70% of clicks are made on the first page, and from the third page onward, the click probability is less than 2% [34]. For these reasons, we considered only the first 50 hits per search engine, per search term, and per country, which resulted in a total sample of 1500 websites.

After removing duplicates, we excluded websites not related to health care. We then checked each individual link based on predefined inclusion criteria. We were interested in websites that allowed users to search for a facility nationwide, provided access to quality information without registration, and gave users the possibility to choose from a range of facilities from different providers (ie, not the home pages of single providers). Those websites had to provide at least 1 instance of quality information, which could be either objective (eg, pressure ulcer) or subjective (eg, consumer reviews) [35]. We excluded websites if they provided no quality information but only background information about long-term care or if they reported quality information only on other types of care (eg, respite care or home care).

Sampling of Facility Profiles

As suggested by Kast and colleagues [31], on the report cards that met the inclusion criteria, we searched for long-term care facilities in specific cities. Due to the geographic size and population of the countries (the United States with over 300 million inhabitants and the United Kingdom with about 65 million) [36], 4 cities were chosen for the United States and 2 for the United Kingdom. For the former, we chose New York as a metropolitan city with millions of inhabitants and a high price segment. Miami, Portland, and Denver were chosen as medium-sized cities with between 500,000 and 1 million inhabitants. These cities are located in the interior and on the coasts of the country. They have a very high proportion of internet users as well as a growing proportion of older people similar to that of the national population [36,37]. Portland and Miami are among the most popular areas for retirement [38]. Similarly, in the United Kingdom, London as a metropolitan city covers the southern part, and Glasgow as a medium-sized city covers the northern part of the country. The 2 cities have a higher-than-average proportion of older people compared with the national population [36]. Both cities have a large proportion of internet users [39].

For each included report card and city, we considered a sample of 10 facilities, as suggested by Kast and colleagues [31]. As we intended to obtain a comprehensive view of the information report cards are able to provide, we needed to analyze more sophisticated profiles. We expected to find such profiles in the described urban areas rather than in rural regions: in cities, the competition between facilities might be more intense, and consumers might be more engaged in using internet-based information, both of which can lead to a more active use of profiles by facilities. In total, we screened 1320 facility profiles (26 US report cards × 4 cities × 10 facilities = 1040, plus 14 UK report cards × 2 cities × 10 facilities = 280). Based on those profiles, we analyzed the information provided by the included report cards.

Data Extraction and Analysis

Figure 1 shows an example of a report card and highlights some relevant elements of the data extraction on this report card. From each included report card, we first collected general information, as suggested by Emmert and colleagues [35], that is, information the websites reported about themselves: the intention of the report card (focus) and the origin of the information (data source). We then collected information about the functional scope, as suggested by Kast and colleagues [31]. We checked whether users could communicate their satisfaction with a facility to others. In addition, we examined the possibilities of customizing the search, which included a simple search field or an advanced search. The former generally requires users to specify what they are looking for and where, for example, a nursing home within a specific postal code area. An advanced search allows users to specify the search in more detail, with additional customization of distance, facility type, or both. Furthermore, we considered simplification tools, which allow users to handle the complexity of information [40]. Such tools could allow for adjusting the number of hits per page or for sorting and filtering the hit list. Additionally, we checked for the ability to compare several facilities with each other.

Figure 1. Example of report card elements (shortened for better comprehensibility).

We checked whether quality information was available. For objective quality, we extracted the following 3 types of information based on the work of Donabedian [41]: structure, process, and outcome quality. These types have been used in quality-of-care studies [26,42-44]. In this study, structural quality, for example, refers to location, wheelchair accessibility, staffing ratio, and facility equipment. Process quality covers aspects such as the use of physical restraint, pressure ulcer prophylaxis, or the involvement of relatives in decision-making processes. Outcome quality encompasses, for example, the number of residents with pressure ulcers or unintentional weight loss. As suggested by previous studies [31,35], we added further types to these 3 types of information. We checked the availability of price information on report cards. We also examined whether the inspection results based on the official quality checks (eg, from the CMS, CQC, or other quality initiatives) were presented. For subjective information, we examined whether consumer experience and satisfaction data (consumer feedback) were displayed.

On report cards with inspection results, we analyzed the presentation format of this information. We distinguished composite measures [35] (eg, a 5-star quality rating), either for different areas (eg, health outcomes of residents, staffing, or nursing care) or for the overall quality of a facility. We also checked for alternative presentations of this information, such as a downloadable quality report [31] or a link to another website (eg, the official CMS website). On report cards that provided consumer feedback, we distinguished between presentation formats such as a scaled quality assessment of defined areas (eg, food quality) and free-text comments, as suggested by Emmert and colleagues [35].

All mentioned data were extracted in Microsoft Excel (Microsoft Corporation) sheets using the described types of quality information, presentation formats, and functions. Any described type was considered covered as soon as we observed at least 1 instance of quality information, presentation format, or function associated with the respective type. Thus, for each type, we stated whether it was covered on the respective website (yes or no) [30]. After data extraction, each type was analyzed descriptively using total numbers and percentages.
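As a minimal sketch of this coding and tallying step (assuming a tabular extraction sheet as described; the report card names and coded values below are purely illustrative), the per-country counts and percentages reported in Tables 2-4 can be computed as follows:

```python
import pandas as pd

# Illustrative extraction sheet: one row per report card, one binary column per
# coded type (1 = at least one instance observed on the website, 0 = none observed).
extraction = pd.DataFrame(
    {
        "report_card": ["US card A", "US card B", "UK card C", "UK card D"],
        "country": ["US", "US", "UK", "UK"],
        "structure": [1, 1, 1, 1],
        "process": [1, 0, 1, 0],
        "outcome": [1, 0, 0, 0],
        "prices": [0, 1, 1, 0],
    }
)
types = ["structure", "process", "outcome", "prices"]

# Descriptive analysis: absolute counts (n) and percentages per country and overall.
counts_per_country = extraction.groupby("country")[types].sum()
percent_per_country = (extraction.groupby("country")[types].mean() * 100).round()
total_counts = extraction[types].sum()
total_percent = (extraction[types].mean() * 100).round()

print(counts_per_country)
print(percent_per_country)
print(total_counts)
print(total_percent)
```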

Ethical Considerations

Ethics approval was not required for this study since all data were publicly available and there were no human participants.


Results

Identified Report Cards

We identified 40 report cards (Table 1): 26 (65%) in the United States and 14 (35%) in the United Kingdom (for URL and provider information of the report cards, see Table S2 in Multimedia Appendix 1). No report card restricted the search to long-term care facilities only (Table 2). Instead, the websites also included other care services for older people (eg, hospice care or home care), other providers (eg, physicians or hospitals), or nonmedical services (eg, hairdressers). In the United States, most (n=17, 65%) report cards focused on care services for older people, compared with less than half (n=6, 43%) in the United Kingdom.

Table 1. Process and result of report card identification.

Phases and processes | US report cards, n | UK report cards, n
Identification
  Drawn sample | 600 | 900
Removal
  Duplicates | 313 | 445
  Remaining after removing duplicates | 287 | 455
Screening by selection criteria
  Not health care | 28 | 135
  Background information | 146 | 271
  Not nationwide search | 56 | 11
  Registration required | 6 | 4
  Error | 25 | 15
  Hand search | 6^a | 6^a
Potentially relevant after screening | 32 | 25
Data collection
  Background information | 1 | 2
  Not nationwide search | 1 | 8
  Other kinds of facility | 4 | 0
  Not available anymore | 0 | 1
Report cards included | 26 | 14

^a Identified during the screening phase and added for the data collection phase.

Table 2. General information about report cards.

Information | US report cards (N=26), n (%) | UK report cards (N=14), n (%) | Total report cards (N=40), n (%)
Focus on
  Providers of care services for older people | 17 (65) | 6 (43) | 23 (58)
  Different health care providers | 2 (8) | 2 (14) | 4 (10)
  No specific focus on health care | 7 (27) | 6 (43) | 13 (33)
Data source
  Inspection results (eg, Centers for Medicare & Medicaid Services) | 6 (23) | 2 (14) | 8 (20)
  Consumer or provider information (eg, comments or provider registration data) | 10 (38) | 4 (29) | 14 (35)
  Inspection results combined with consumer or provider information | 4 (15) | 1 (7) | 5 (13)
  Copied from publicly available information (eg, Nursing Home Compare website) | 2 (8) | N/A^a | 2 (5)
  Not reported | 4 (15) | 7 (50) | 11 (28)

^a N/A: not applicable.

In both countries, the data provided on the report cards had different origins (Table 2 and Table S3 in Multimedia Appendix 1). The most frequently stated data source was information provided by service consumers (eg, user comments) and service providers (eg, nursing home providers registered on the respective website) only (n=10, 38% in the United States and n=4, 29% in the United Kingdom). Another frequently reported data source was data from inspections conducted by government agencies (n=6, 23% in the United States and n=2, 14% in the United Kingdom). Some report cards combined both types of data sources (n=4, 15% in the United States and n=1, 7% in the United Kingdom). On half of the UK websites (n=7, 50%), no information on the data source was given; on the US websites, this was the case for only 4 (15%) websites.

Functions of Report Cards

Functions are shown in Table 3 and Table S4 in Multimedia Appendix 1. While a simple search function was available on all identified report cards, only 7 (18%) of them offered an advanced search field (n=5, 19% in the United States and n=2, 14% in the United Kingdom). Other functions were similarly available in both countries. On more than half of the websites (n=24, 60%), users could customize the hit list using various simplification functions after a search had been performed. The sorting option was the most frequent (n=21, 88%) simplification function, followed by filters (n=15, 63%) and the possibility to change the number of facilities shown (n=3, 13%). Even rarer (n=3, 8%) was the ability to compare multiple potential facilities on the internet. In contrast, the option for users to leave feedback on a particular facility was offered by about half of the report cards (n=21, 53%). After registration, users could rate facilities according to certain predefined criteria (eg, friendliness of staff and quality of food) or leave a free-text comment.

Table 3. Functions on websites.

Function | US report cards (N=26), n/N (%) | UK report cards (N=14), n/N (%) | Total report cards (N=40), n/N (%)
Kind of search field
  Single | 26/26 (100) | 14/14 (100) | 40/40 (100)
  Advanced | 5/26 (19) | 2/14 (14) | 7/40 (18)
Simplification tools
  No | 11/26 (42) | 5/14 (36) | 16/40 (40)
  Yes | 15/26 (58) | 9/14 (64) | 24/40 (60)
    Sort | 13/15 (87) | 8/9 (89) | 21/24 (88)
    Filter | 8/15 (53) | 7/9 (78) | 15/24 (63)
    Adjust | 2/15 (13) | 1/9 (11) | 3/24 (13)
Internet-based comparison
  No | 24/26 (92) | 13/14 (93) | 37/40 (92)
  Yes | 2/26 (8) | 1/14 (7) | 3/40 (8)
Consumer feedback allowed
  No | 12/26 (46) | 7/14 (50) | 19/40 (47)
  Yes | 14/26 (54) | 7/14 (50) | 21/40 (53)

Quality Information and Its Presentation

Table 4 and Table S5 in Multimedia Appendix 1 show details on types of quality information and presentation formats. All report cards showed information on structural quality. Less than half of them (n=15, 38%) reported information about processes in facilities; this kind of information was found more frequently in the United Kingdom (n=7, 50%) than in the United States (n=8, 31%). In contrast, information on outcome quality was reported only in the United States (n=8, 31%). Prices were reported on 12 (30%) of all report cards. Consumer feedback, the subjective type of information, was more common in both countries (n=27, 68%) than any objective quality information except structural quality.

Table 4. Types of quality information and presentation format.

Quality information and presentation format | US report cards (N=26), n/N (%) | UK report cards (N=14), n/N (%) | Total (N=40), n/N (%)
Structure
  No | 0/26 (0) | 0/14 (0) | 0/40 (0)
  Yes | 26/26 (100) | 14/14 (100) | 40/40 (100)
Process
  No | 18/26 (69) | 7/14 (50) | 25/40 (62)
  Yes | 8/26 (31) | 7/14 (50) | 15/40 (38)
Outcome
  No | 18/26 (69) | 14/14 (100) | 32/40 (80)
  Yes | 8/26 (31) | 0/14 (0) | 8/40 (20)
Prices
  No | 19/26 (73) | 9/14 (64) | 28/40 (70)
  Yes | 7/26 (27) | 5/14 (36) | 12/40 (30)
Inspection results
  No | 10/26 (38) | 6/14 (43) | 16/40 (40)
  Yes | 16/26 (62) | 8/14 (57) | 24/40 (60)
    Overall^a | 15/16 (94) | 5/8 (63) | 20/24 (83)
    By area^b | 14/16 (88) | 6/8 (75) | 20/24 (83)
    Details^c | 8/16 (50) | 1/8 (13) | 9/24 (38)
    Report^d | 3/16 (19) | 0/8 (0) | 3/24 (13)
    Link^e | 1/16 (6) | 7/8 (88) | 8/24 (33)
Consumer feedback
  No | 8/26 (31) | 5/14 (36) | 13/40 (32)
  Yes | 18/26 (69) | 9/14 (64) | 27/40 (68)
    Scaled | 18/18 (100) | 9/9 (100) | 27/27 (100)
    Comment | 13/18 (72) | 7/9 (78) | 20/27 (74)

^a Composite measure as overall rating.

^b Composite measure as rating by area (eg, health inspections, staffing, and resident care).

^c Detailed information from inspections by area.

^d Reports for download.

^e Link to the authority website with detailed inspection results.

On all 27 report cards with consumer feedback, it was presented in the form of scaled ratings. This means that certain predefined areas, such as cleanliness or food quality, were rated by users on a scale and displayed, for example, in the form of stars. Many (n=20, 74%) of those ratings were supplemented by free-text comments written by users. Information on the quality of structure, processes, and outcomes usually originates from inspections. More than half (n=24, 60%) of the identified report cards made use of this information. Most of it was presented as composite measures, which showed the overall quality (n=20, 83%) or were separated by area (n=20, 83%), such as health outcomes, staffing, or nursing care. In the United Kingdom, most (n=7, 88%) of the 8 report cards with inspection-based information displayed it as a link to the website of the responsible authority (eg, the CQC or Care Inspectorate), where users could see inspection results in more detail. The US websites provided such links less often (n=1, 6%), but in some cases, they provided the information in detail on the respective report card (n=8, 50%) or as a complete CMS report with inspection results available as a file for download (n=3, 19%).


Discussion

Principal Findings

Many people need to move to a long-term care facility, and web-based public reporting can help them find a suitable one. With this study, we aimed to analyze the current state of such report cards in the United States and the United Kingdom, 2 countries with a long tradition of public reporting. We found 40 report cards that allowed a nationwide search for a long-term care facility and provided free access to quality information. This study shows that many aspects were well covered on those websites: most report cards focused on care services for older people, structural quality information was always provided, users had the possibility to share experiences, and information from inspections was always displayed as composite measures. However, our findings suggest that there are also deficits on many report cards in the United States and the United Kingdom. Report cards in both countries often failed to state the source of their data and reported prices and consumer feedback only sparingly. Many of them did not provide an advanced search function, simplification tools, or comparison functions.

Kumpunen and colleagues [2] reported several years ago on the rapid and substantial growth in the number of such websites, a growth that even required an extra website to guide users to the relevant information. Considering this, we expected to find a greater number of report cards. However, we also know that consumers often use only 1 report card as a basis for decision-making [45]. Therefore, it might be even more important to optimize existing report cards instead of expanding their number. Below, we explain the problems associated with the stated deficits and provide implications for improvement.

Implications

As mentioned above, the aim of public reporting is to improve the quality of health care services [5]. Providers are those who can undertake measures to change their services [6]. However, as emphasized by Berwick and colleagues [4], this strongly depends on the intrinsic motivation to perform better than others and is not a reliable strategy in a complex system such as health care. Consequently, the “selection pathway” is even more important, as it creates pressure through the consequences of consumers’ choices and motivates providers to undertake improvement measures [4]. However, the realization of this pathway can fail because users are dissatisfied with report cards and do not take them into account when making the final decision [46]. For this reason, it is necessary to make report cards more helpful for users [47,48].

First, it is important to be transparent about where the data came from since this is an essential aspect of credibility [49-51]. We found that, in total, 11 (28%) of the 40 report cards, and especially many (7/14, 50%) in the United Kingdom, did not state the source of the provided information. Not reporting the data source can represent a great obstacle to the effectiveness of public reporting instruments [52], as consumers doubt the trustworthiness of the provided quality information and often refrain from using report cards [50]. Thus, we recommend that both countries, and especially the United Kingdom, more clearly communicate the source of the data they use.

Second, some types of information are more important to users than others. Research shows that consumers are more likely to decide on such criteria as prices or recommendations than on medical information [53-56]. Thus, to be useful for consumers, report cards should provide quality information that is of interest to them. In both countries, prices in this sector are a relevant criterion when choosing a facility, as the monthly out-of-pocket payments are high (US $7908 in the United States [57] and £3552 [US $4280] in the United Kingdom [58]). We found, however, that websites insufficiently reported prices (12/40, 30%). Particularly in the United States, there is room for improvement in price transparency, considering that only 7 (27%) of 26 report cards showed price information, and 3 (43%) of those 7 would only reveal details after registration. Similarly, consumer feedback was missing on one-third (13/40, 32%) of the identified report cards in this study. Hefele and colleagues [59] emphasize that the lack of subjective perspective on report cards pushes consumers to use information on social media, yet its validity is still unclear [60,61]. Consumer feedback can be a valuable supplement to objective quality information; it can strengthen the attractiveness of public reporting and improve the quality of decision-making [62-64]. Thus, especially in the United States, price transparency should be improved. In both countries, more report cards should supplement the objective quality information with subjective consumer feedback.

Third, it is not enough to show the content users prefer to help them make an informed decision. Sometimes the amount of information can be overwhelming for users [65]. Different studies showed that people wish for a lot of different quality information, but it increases the complexity without necessarily improving decision-making [65-70]. To counteract this problem of overestimating one’s own ability to process large amounts of information, it is important for report cards to include customizable formats [40]. In this study, however, only a few report cards allowed for customization. At the beginning of the search process, most report cards only offered a simple search function. While this function may have different levels of technical sophistication and deliver different results depending on search behavior [71], an advanced search would give users more possibilities to customize the search based on their own preferences and would help them find relevant information more easily and more quickly. To reduce the complexity during the screening of the hit list, simplification tools could be helpful [40]. This study shows that many report cards used at least 1 of such tools (eg, filters), but 40% (16/40) did not.

Furthermore, users may find it difficult to compare multiple potential facilities from a hit list. An internet-based comparison of several facilities allows users to get a first impression of the performance of the facility in question and to avoid errors in the interpretation of the quality information [72]. In addition, users do not have to gather information themselves in a time-consuming manner [73]. Although the literature suggests that a comparison function is useful, this study shows that it was especially rare in both countries (2 report cards in the United States and 1 in the United Kingdom). Thus, we recommend that website managers in both countries integrate advanced search functions and simplification tools more widely and more often provide the opportunity to compare several potential facilities.

All in all, it seems that, for several reasons, web-based public reporting in its current form cannot be a helpful decision-making tool for consumers. At present, users do not have the opportunity to assess the trustworthiness of information, the displayed quality information does not always correspond to their preferences, and users are not provided with enough functions to handle the complexity on report cards. Improving the identified websites would not only help consumers make an informed decision on the selection of a long-term care facility (selection pathway [4]) but could also make the other parts of the chain suggested by Berwick and colleagues [4] work: through the consequences of consumer choice, it could increase the odds of motivating providers to act on quality improvements and, thus, improve the quality of health care services.

Significance of the Study

This study suggests that addressing the identified deficiencies can contribute to more effective public reporting and improved long-term care services for older people who are dependent on others. According to our assessment, many of these specific problems are easy to fix (eg, missing functions). The study benefits various groups associated with long-term care. On the practical side, it points report card managers to the most pressing issues, helps health care agencies better understand the information preferences of consumers, and motivates long-term care facilities to provide more quality information on their report card profiles. From a theoretical standpoint, our findings provide a possible explanation for why report cards are still poorly used by consumers. Further studies could work on the optimization of report cards by testing different formats using inferential statistics; our findings can serve as a basis for such studies. Based on experiences from the United States and the United Kingdom, other countries (eg, Germany) have adopted public reporting in institutional long-term care. This implies that similar strengths and weaknesses of web-based public reporting could be identified in those countries, which would limit its ability to serve as a helpful decision-making tool for consumers. We encourage other countries to evaluate their current state of public reporting. This paper provides an extensive blueprint for such a reflection by highlighting elements of different areas of web-based public reporting that should be addressed.

Limitations

When interpreting and using the findings from this study, some aspects should be kept in mind. Although we updated our data in mid-2021, websites are subject to short-term changes at any time. Therefore, it is possible that the availability of websites and the status of functions and information on them today differ from those at the time of data collection. In this study, we defined outcome quality in terms of concrete objective indicators (eg, number of falls). This resulted in a lack of outcome quality being identified in the United Kingdom. To avoid misinterpretation, however, it should be kept in mind that the United Kingdom addresses these aspects in a different way (eg, under safety). Furthermore, since it was not the subject of this study, we did not examine to what extent a type of quality information (eg, structural quality) was available on individual websites but only whether it was encountered at least once. We also did not evaluate in which format the quality information within a category was displayed (eg, numbers, graphs, or words). Further research could investigate these questions to give more specific recommendations for improvement to the managers of the identified report cards. In addition, we only included websites that allowed a nationwide search. Both countries, however, also have many report cards that are specific to a state or constituent country (eg, Aging and Disability Resource Connection of Oregon [74] and CQC [14]). As we assume that consumers should have the possibility to choose from a range of existing facilities, we decided to define our selection criteria this way. Thus, our findings are not generalizable to the whole field of web-based public reporting in long-term care but only to report cards covering a nationwide search. Finally, we selected the United States and the United Kingdom due to their long tradition of public reporting and the decades of research on this topic, which is a strength of our study. The Netherlands, however, also has considerable experience with public reporting [75] and would complement our research. Due to the risk of misinterpreted translations, we decided to focus on a smaller set of countries.

Conclusions

All in all, it seems that web-based public reporting in its current form cannot be a helpful decision-making tool for consumers. At present, users do not have the opportunity to assess the trustworthiness of information, the displayed quality information does not always correspond to their preferences, and users are not provided with enough functions to handle the complexity on report cards. Both countries, but especially the United Kingdom, should become more transparent about the source of data they use; especially in the United States, price transparency should improve. In both countries, more report cards should supplement the objective quality information with subjective consumer feedback. Report card managers in both countries should more intensively integrate the advanced search function and simplification tools on the websites and more often provide the opportunity to compare several potential facilities. These improvements could not only make the report cards a more helpful decision-making tool for users but also bring public reporting a bit closer to its goal of improving the quality of health care services.

Authors' Contributions

KK developed the idea, conceptualized methods, updated data collection and analysis, wrote the first draft, edited, and finalized the manuscript. SMO conceptualized methods, collected data, and carried out first data analyses. JK supported concept development and data analyses and assisted in the visualization of results. CBM gave supervision on methodology conception and manuscript writing. All authors reviewed the manuscript multiple times and gave their valuable critical feedback to improve the content of the manuscript.

Conflicts of Interest

None declared.

Multimedia Appendix 1

Additional information on methodology and results.

PDF File (Adobe PDF File), 228 KB

  1. Organisation for Economic Co-operation and Development. Health at a Glance 2021: OECD Indicators. Paris: OECD Publishing; 2021.
  2. Kumpunen S, Trigg L, Rodrigues R. Public reporting in health and long-term care to facilitate provider choice. World Health Organization. 2014. URL: https://tinyurl.com/3vz2kx6z [accessed 2023-11-30]
  3. Rodrigues R, Trigg L, Schmidt AE, Leichsenring K. The public gets what the public wants: experiences of public reporting in long-term care in Europe. Health Policy. 2014;116(1):84-94. [FREE Full text] [CrossRef] [Medline]
  4. Berwick DM, James B, Coye MJ. Connections between quality measurement and improvement. Med Care. 2003;41(Suppl 1):I30-I38. [CrossRef] [Medline]
  5. Clement JP, Bazzoli GJ, Zhao M. Nursing home price and quality responses to publicly reported quality information. Health Serv Res. 2012;47(1 Pt 1):86-105. [FREE Full text] [CrossRef] [Medline]
  6. Werner RM, Norton EC, Konetzka RT, Polsky D. Do consumers respond to publicly reported quality information? evidence from nursing homes. J Health Econ. 2012;31(1):50-61. [FREE Full text] [CrossRef] [Medline]
  7. Marshall MN, Shekelle PG, Davies HTO, Smith PC. Public reporting on quality in the United States and the United Kingdom. Health Aff (Millwood). 2003;22(3):134-148. [CrossRef] [Medline]
  8. Marshall M, Shekelle P, Brook R, Leatherman S. Dying to Know: Public Release of Information About Quality of Health Care. Santa Monica, CA: Rand; 2000.
  9. Veillard J, Garcia-Armesto S, Kadandale S, Klazinga N, Leatherman S. International health system comparisons: from measurement challenge to management tool. In: Smith PC, Mossialos E, Papanicolas I, Leatherman S, editors. Performance Measurement for Health System Improvement. Cambridge: Cambridge University Press; 2010:641-672.
  10. Castle NG, Ferguson JC. What is nursing home quality and how is it measured? Gerontologist. 2010;50(4):426-442. [FREE Full text] [CrossRef] [Medline]
  11. Medicare.gov. URL: https://www.medicare.gov/ [accessed 2023-11-27]
  12. Cylus J, Richardson E, Findley L, Longley M, O'Neill CJ, Steel D. United Kingdom: health system review. Health Syst Transit. 2015;17(5):1-126. Copenhagen: WHO Regional Off. for Europe [FREE Full text] [Medline]
  13. Care Quality Commission inspection ratings. National Health Service. URL: https://www.nhs.uk/scorecard/8175 [accessed 2022-10-13]
  14. Care Quality Commission. URL: https://www.cqc.org.uk/ [accessed 2023-11-27]
  15. NHS Choices. NHS UK. URL: https://www.nhs.uk/ [accessed 2023-11-27]
  16. Independent review of adult social care—background paper: minutes and papers from the meeting of the Independent Review of Adult Social Care Group held on 1 October 2020. Healthcare Quality and Improvement Directorate. Care Inspectorate. 2020. URL: https://tinyurl.com/mr2zuybh [accessed 2022-10-13]
  17. Care Inspectorate. URL: https://www.careinspectorate.com/index.php/about-us [accessed 2022-10-13]
  18. A quality framework for care homes for adults and older people: For use in self-evaluation, scrutiny, and improvement support. Care Inspectorate. 2022. URL: https://tinyurl.com/yc5u2mt6 [accessed 2023-11-30]
  19. Care Inspectorate. URL: http://careinspectorate.com/ [accessed 2023-11-27]
  20. Jensdóttir AB, Rantz M, Hjaltadóttir I, Gudmundsdòttir H, Rook M, Grando V. International comparison of quality indicators in United States, Icelandic and Canadian nursing facilities. Int Nurs Rev. 2003;50(2):79-84. [CrossRef] [Medline]
  21. Nakrem S, Vinsnes AG, Harkless GE, Paulsen B, Seim A. Nursing sensitive quality indicators for nursing home care: international review of literature, policy and practice. Int J Nurs Stud. 2009;46(6):848-857. [FREE Full text] [CrossRef] [Medline]
  22. Trigg L, Kumpunen S, Holder J, Maarse H, Juvés MS, Gil J. Information and choice of residential care provider for older people: a comparative study in England, the Netherlands and Spain. Ageing Soc. 2017;38(6):1121-1147. [CrossRef]
  23. Hutchinson AM, Draper K, Sales AE. Public reporting of nursing home quality of care: lessons from the United States experience for Canadian policy discussion. Healthc Policy. 2009;5(2):87-105. [FREE Full text] [Medline]
  24. du Moulin MFMT, van Haastregt JCM, Hamers JPH. Monitoring quality of care in nursing homes and making information available for the general public: state of the art. Patient Educ Couns. 2010;78(3):288-296. [FREE Full text] [CrossRef] [Medline]
  25. Rechel B, McKee M, Haas M, Marchildon GP, Bousquet F, Blümel M, et al. Public reporting on quality, waiting times and patient experience in 11 high-income countries. Health Policy. 2016;120(4):377-383. [FREE Full text] [CrossRef] [Medline]
  26. Damman OC, van den Hengel YK, van Loon AJM, Rademakers J. An international comparison of web-based reporting about health care quality: content analysis. J Med Internet Res. 2010;12(2):e8. [FREE Full text] [CrossRef] [Medline]
  27. Emmert M, Maryschok M, Eisenreich S, Schöffski O. Websites to assess quality of care: appropriate to identify good physicians? Gesundheitswes. 2009;71(4):e18-e27. [CrossRef] [Medline]
  28. Search engine market share worldwide (Jan-Dec 2020). Statcounter. GlobalStats. 2022. URL: https://gs.statcounter.com/search-engine-market-share/all/worldwide/2020 [accessed 2022-10-13]
  29. CyberGhost VPN. CyberGhost. 2020. URL: https://www.cyberghostvpn.com/en_US/ [accessed 2022-10-13]
  30. Castle NG, Lowe TJ. Report cards and nursing homes. Gerontologist. 2005;45(1):48-67. [FREE Full text] [CrossRef] [Medline]
  31. Kast K, Emmert M, Maier CB. Public reporting on long-term care facilities in Germany: current state and evaluation of quality information. Gesundheitswes. 2021;83(10):809-817. [CrossRef] [Medline]
  32. Care homes. National Health Service. 2019. URL: https://tinyurl.com/e4cx75zy [accessed 2022-10-13]
  33. Adwords uncovered: how users perceive the new Google results page. Usability.de. 2016. URL: https://tinyurl.com/3ab2s7pj [accessed 2022-10-13]
  34. Petrescu P. Google Organic CTR-2014 Report. 2019. URL: https://www.advancedwebranking.com/blog/google-organic-ctr/ [accessed 2022-10-13]
  35. Emmert M, Becker S, Sander U. An international comparison of public reporting about the quality of hospitals: where do we stand and what can we learn? Health Econ Qual Manag. 2017;22(04):206-212. [CrossRef]
  36. World development indicators. The World Bank. 2020. URL: https://databank.worldbank.org/source/world-development-indicators [accessed 2022-10-13]
  37. QuickFacts. United States Census Bureau. 2019. URL: https://www.census.gov/quickfacts/fact/table/ [accessed 2022-10-13]
  38. 2024 best places to retire in the US. U.S. News. URL: https://realestate.usnews.com/places/rankings/best-places-to-retire [accessed 2022-10-13]
  39. Internet use in the UK annual estimates by age, sex, disability and geographical location. Office for National Statistics. 2021. URL: https://tinyurl.com/nhjyr3v8 [accessed 2022-10-13]
  40. Kurtzman ET, Greene J. Effective presentation of health care performance information for consumer decision making: a systematic review. Patient Educ Couns. 2016;99(1):36-43. [FREE Full text] [CrossRef] [Medline]
  41. Donabedian A. The quality of care. How can it be assessed? JAMA. 1988;260(12):1743-1748. [CrossRef] [Medline]
  42. Lorini C, Porchia BR, Pieralli F, Bonaccorsi G. Process, structural, and outcome quality indicators of nutritional care in nursing homes: a systematic review. BMC Health Serv Res. 2018;18(1):43. [FREE Full text] [CrossRef] [Medline]
  43. Kolb B, Emmert M, Sander U, Patzelt C, Schöffski O. Do German public reporting websites provide information that office-based physicians consider before referring patients to hospital? a four-step analysis. Z Evid Fortbild Qual Gesundhwes. 2018;137-138:42-53. [FREE Full text] [CrossRef] [Medline]
  44. Emmert M, Sander U, Esslinger AS, Maryschok M, Schöffski O. Public reporting in Germany: the content of physician rating websites. Methods Inf Med. 2012;51(2):112-120. [CrossRef] [Medline]
  45. van Deursen AJAM, van Dijk JAGM. Using the internet: skill related problems in users' online behavior. Interact Comput. 2009;21(5-6):393-402. [CrossRef]
  46. Emmert M, Wiener M. What factors determine the intention to use hospital report cards? the perspectives of users and non-users. Patient Educ Couns. 2017;100(7):1394-1401. [FREE Full text] [CrossRef] [Medline]
  47. Sandmeyer B, Fraser I. New evidence on what works in effective public reporting. Health Serv Res. 2016;51(Suppl 2):1159-1166. [FREE Full text] [CrossRef] [Medline]
  48. Damberg CL, McNamara P. Postscript: research agenda to guide the next generation of public reports for consumers. Med Care Res Rev. 2014;71(Suppl 5):97S-107S. [CrossRef] [Medline]
  49. Kanouse DE, Schlesinger M, Shaller D, Martino SC, Rybowski L. How patient comments affect consumers' use of physician performance measures. Med Care. 2016;54(1):24-31. [FREE Full text] [CrossRef] [Medline]
  50. Konetzka RT, Perraillon MC. Use of nursing home compare website appears limited by lack of awareness and initial mistrust of the data. Health Aff (Millwood). 2016;35(4):706-713. [FREE Full text] [CrossRef] [Medline]
  51. Marshall MN, Hiscock J, Sibbald B. Attitudes to the public release of comparative information on the quality of general practice care: qualitative study. BMJ. 2002;325(7375):1278. [FREE Full text] [CrossRef] [Medline]
  52. Schneider EC, Lieberman T. Publicly disclosed information about the quality of health care: response of the US public. Qual Health Care. 2001;10(2):96-103. [FREE Full text] [CrossRef] [Medline]
  53. Marshall MN, Shekelle PG, Leatherman S, Brook RH. Public disclosure of performance data: learning from the US experience. Qual Health Care. 2000;9(1):53-57. [FREE Full text] [CrossRef] [Medline]
  54. Schmitz H, Stroka MA. Do elderly choose nursing homes by quality, price or location? In: Ruhr Economic Papers No. 495. Bochum: Ruhr University Bochum; 2014.
  55. Hibbard JH, Jewett JJ. Will quality report cards help consumers? Health Aff (Millwood). 1997;16(3):218-228. [CrossRef] [Medline]
  56. Hoffstedt C, Fredriksson M, Winblad U. How do people choose to be informed? a survey of the information searched for in the choice of primary care provider in Sweden. BMC Health Serv Res. 2021;21(1):559. [FREE Full text] [CrossRef] [Medline]
  57. Cost of care survey. Genworth. 2021. URL: https://www.genworth.com/aging-and-you/finances/cost-of-care.html [accessed 2022-10-13]
  58. Care home fees and costs: how much do you pay? carehome.co.uk. 2022. URL: https://www.carehome.co.uk/advice/care-home-fees-and-costs-how-much-do-you-pay [accessed 2022-10-13]
  59. Hefele JG, Li Y, Campbell L, Barooah A, Wang J. Nursing home Facebook reviews: who has them, and how do they relate to other measures of quality and experience? BMJ Qual Saf. 2018;27(2):130-139. [CrossRef] [Medline]
  60. Li Y, Cai X, Wang M. Social media ratings of nursing homes associated with experience of care and "nursing home compare" quality measures. BMC Health Serv Res. 2019;19(1):260. [FREE Full text] [CrossRef] [Medline]
  61. Johari K, Kellogg C, Vazquez K, Irvine K, Rahman A, Enguidanos S. Ratings game: an analysis of nursing home compare and yelp ratings. BMJ Qual Saf. 2018;27(8):619-624. [CrossRef] [Medline]
  62. Konetzka RT, Yan K, Werner RM. Two decades of nursing home compare: what have we learned? Med Care Res Rev. 2021;78(4):295-310. [FREE Full text] [CrossRef] [Medline]
  63. Nadash P, Hefele JG, Miller EA, Barooah A, Wang XJ. A national-level analysis of the relationship between nursing home satisfaction and quality. Res Aging. 2019;41(3):215-240. [CrossRef] [Medline]
  64. Powell J, Atherton H, Williams V, Mazanderani F, Dudhwala F, Woolgar S, et al. Using online patient feedback to improve NHS services: the INQUIRE multimethod study. Health Serv Deliv Res. 2019;7(38). [CrossRef]
  65. Schlesinger M, Kanouse DE, Martino SC, Shaller D, Rybowski L. Complexity, public reporting, and choice of doctors: a look inside the blackest box of consumer behavior. Med Care Res Rev. 2014;71(Suppl 5):38S-64S. [FREE Full text] [CrossRef] [Medline]
  66. Reutskaja E, Lindner A, Nagel R, Andersen RA, Camerer CF. Choice overload reduces neural signatures of choice set value in dorsal striatum and anterior cingulate cortex. Nat Hum Behav. 2018;2(12):925-935. [CrossRef] [Medline]
  67. Sicilia M, Ruiz S, Munuera JL. Effects of interactivity in a web site: the moderating effect of need for cognition. J Advert. 2005;34(3):31-44. [CrossRef]
  68. Peters E, Dieckmann N, Dixon A, Hibbard JH, Mertz CK. Less is more in presenting quality information to consumers. Med Care Res Rev. 2007;64(2):169-190. [CrossRef] [Medline]
  69. Hibbard JH, Peters E, Slovic P, Finucane ML, Tusler M. Making health care quality reports easier to use. Jt Comm J Qual Improv. 2001;27(11):591-604. [CrossRef] [Medline]
  70. Schapira MM, Shea JA, Duey KA, Kleiman C, Werner RM. The nursing home compare report card: perceptions of residents and caregivers regarding quality ratings and nursing home choice. Health Serv Res. 2016;51(Suppl 2):1212-1228. [FREE Full text] [CrossRef] [Medline]
  71. Eysenbach G, Powell J, Kuss O, Sa ER. Empirical studies assessing the quality of health information for consumers on the world wide web: a systematic review. JAMA. 2002;287(20):2691-2700. [CrossRef] [Medline]
  72. Gerteis M, Gerteis JS, Newman D, Koepke C. Testing consumers' comprehension of quality measures using alternative reporting formats. Health Care Financ Rev. 2007;28(3):31-45. [FREE Full text] [Medline]
  73. Castle NG. Searching for and selecting a nursing facility. Med Care Res Rev. 2003;60(2):223-247. [Medline]
  74. ADRC of Oregon. URL: https://www.oregon.gov/odhs/providers-partners/community-services-supports/pages/adrc.aspx [accessed 2023-11-28]
  75. Damman OC, Hendriks M, Rademakers J, Spreeuwenberg P, Delnoij DMJ, Groenewegen PP. Consumers' interpretation and use of comparative information on the quality of health care: the effect of presentation approaches. Health Expect. 2012;15(2):197-211. [FREE Full text] [CrossRef] [Medline]


Abbreviations

CMS: Centers for Medicare & Medicaid Services
CQC: Care Quality Commission


Edited by A Mavragani; submitted 18.11.22; peer-reviewed by Y Li, U Sander; comments to author 03.01.23; revised version received 09.02.23; accepted 22.11.23; published 14.12.23.

Copyright

©Kristina Kast, Sara-Marie Otten, Jens Konopik, Claudia B Maier. Originally published in JMIR Formative Research (https://formative.jmir.org), 14.12.2023.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Formative Research, is properly cited. The complete bibliographic information, a link to the original publication on https://formative.jmir.org, as well as this copyright and license information must be included.