Validation of the Short Assessment of Health Literacy in Portuguese-speaking Adults in Portugal

Validación del Short Assessment of Health Literacy in Portuguese-speaking adults en Portugal

Dagmara Paiva, Susana Silva, Milton Severo, Pedro Moura-Ferreira, Nuno Lunet, Ana Azevedo

Abstract

Objective

To validate the Brazilian version of the Short Assessment of Health Literacy in Portuguese-speaking Adults (SAHLPA), a 50-item test proposed as a particularly helpful instrument to assess health literacy in people with limited skills, in the Portuguese population.

Methods

We used the standard procedure for cultural adaptation and administered the instrument to 249 participants. We examined construct validity using groups with expectedly increasing levels of health literacy (laypersons from the general population, engineering researchers, health researchers, and physicians), and through association with age and educational attainment, dichotomizing scores at the median of the laypersons' group.

Results

Exploratory factor analysis revealed the instrument was one-dimensional and justified reduction to 33 items. SAHLPA-33 displayed adequate reliability (Cronbach's α = 0.73). The frequency of limited health literacy was highest among laypersons and lowest among physicians (p <0.001; p for trend <0.001). The proportion of participants with limited health literacy decreased with increasing education attainment (age- and sex-adjusted p for trend <0.001). Limited health literacy also tended to decrease with age, although the association was non-significant (sex- and education-adjusted p for trend = 0.067).

Conclusion

We culturally adapted a brief and simple instrument for health literacy assessment, and showed it was valid and fairly reliable. In Portuguese low-literate adults, SAHLPA-33 fills the gap in health literacy assessment instruments, and may be used to guide communication strategies with vulnerable patients and communities.

Keywords:
Health literacy; Validation studies; Portugal

Resumen

Objetivo

Validar la versión brasileña del Short Assessment of Health Literacy in Portuguese-speaking Adults (SAHLPA), una prueba de 50 ítems que ha sido propuesta como una herramienta particularmente útil para evaluar la alfabetización en salud en personas con bajas competencias, en la población portuguesa.

Métodos

Se usó el procedimiento habitual para la adaptación cultural. El instrumento fue administrado a 249 participantes. Se evaluó la validez de constructo utilizando grupos con niveles esperados crecientes de alfabetización en salud (personas no cualificadas de la población general, investigadores en el área de la ingeniería, investigadores en salud y médicos) y a través de la asociación con la edad y la escolaridad, dicotomizando las puntuaciones por la mediana de las del grupo de la población general.

Resultados

El análisis factorial exploratorio reveló que el instrumento era unidimensional y así ha sido reducido a 33 ítems. El SAHLPA-33 reveló una consistencia interna aceptable (α de Cronbach = 0,73). La frecuencia de alfabetización en salud limitada fue más elevada en la población general y menor en los médicos (p <0,001; p para la tendencia <0,001). La proporción de participantes con alfabetización en salud limitada disminuyó con el aumento de la escolaridad (p para la tendencia ajustada por edad y sexo <0,001). La alfabetización en salud también tendió a disminuir con la edad, aunque la asociación no era significativa (p para la tendencia ajustada por sexo y escolaridad = 0,067).

Conclusión

Se adaptó un instrumento simple y rápido para evaluar la alfabetización en salud individual y se mostró que era válido y razonablemente fiable. En los adultos portugueses con bajo nivel de alfabetización, SAHLPA-33 llena el vacío en instrumentos de evaluación de alfabetización en salud. Puede utilizarse para guiar estrategias de comunicación con personas y comunidades vulnerables.

Palabras clave:
Alfabetización en salud; Reproducibilidad de los resultados; Portugal

Introduction

Individual health literacy has been defined as “the degree to which people are able to access, understand, appraise and communicate information to engage with the demands of different health contexts to promote and maintain health across the life-course”.[1] Limited health literacy has been linked to various adverse outcomes, including higher mortality, and is more common among the elderly, immigrants, and those with lower levels of education.[2-5]

In the past three decades, numerous instruments have been developed to screen for limited individual health literacy in research or clinical settings.[6] The most widely used include the 66-item Rapid Estimate of Adult Literacy in Medicine (REALM)[7] and the full and short versions of the Test of Functional Health Literacy in Adults (TOFHLA[8] and STOFHLA[9]).[3] Most of them were originally developed in English and are being adapted to other languages and populations.[6] The REALM[10] is a 125-item instrument developed as a fast screening tool to identify patients with limited abilities to read common medical and lay terms for body parts and illnesses. It presents words in ascending order of difficulty and is based on the idea that patients who have trouble reading and pronouncing words will probably also have trouble with reading comprehension. The most commonly used is the reduced 66-item version,[7] which is frequently used to estimate patient reading levels (by converting raw scores into grade equivalents) and to tailor communication with patients accordingly.

In languages with very high letter-to-sound (phoneme-grapheme) correspondence, such as Spanish and Portuguese, the adaptation of health literacy assessment instruments based on word recognition and pronunciation, such as the REALM, is hindered by their inability to discriminate between health literacy and the ability to read.[11,12] The Short Assessment of Health Literacy for Spanish-speaking Adults (SAHLSA)[13] was designed to overcome this issue by incorporating word comprehension. It has been adapted to Portuguese and validated in the Brazilian population as the Short Assessment of Health Literacy for Portuguese-speaking Adults (SAHLPA).[14] These instruments have been proposed as less intimidating alternatives to assess health literacy in a clinical setting, and as particularly helpful in assessing health literacy in the population groups most vulnerable to limited health literacy.[13]

In Portugal, limited health literacy has been estimated to affect between 49%[15] and 73%[16] of the population. There is a lack of health literacy instruments designed specifically for low-literate populations that can be used to tailor health education interventions, as well as to study the impact of this social determinant of health.[17] Given its brevity and ease of administration, we aimed to culturally adapt and validate SAHLPA in the Portuguese population.

Methods

Original instrument

The SAHLPA is the Brazilian adapted version of the SAHLSA. SAHLSA is an instrument based on the 66-item REALM[10] supplemented by a simple comprehension test. An expert panel using the Delphi method developed two simple terms to match each REALM medical term: a key (a word with a similar meaning) and a distractor (a word unrelated to the medical term). The resulting instrument consists of 50 medical terms that participants are asked to read aloud and associate with one of two word options. Participants are shown 50 laminated flash cards, each with a medical term in boldface on top and a key and a distractor at the bottom. Because the key and distractor are used to test comprehension, participants are asked not to guess and to answer “Don't know” if they do not know the correct association. To answer correctly, participants must both correctly pronounce the medical term and match it to the key. The score is calculated as the sum of all correct answers and ranges from 0 to 50. The instrument was validated in a convenience sample of 201 Spanish-speaking adults living in the United States and takes 3-6 minutes to complete.
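
As a minimal sketch of the scoring rule just described (the data structure and field names below are illustrative assumptions, not part of the original instrument materials), an item scores one point only when the medical term is both pronounced correctly and matched to its key; choosing the distractor or answering “Don't know” scores zero:

from dataclasses import dataclass

@dataclass
class ItemResponse:
    pronounced_correctly: bool  # interviewer's judgement of the read-aloud pronunciation
    chosen_option: str          # "key", "distractor" or "dont_know"

def score_item(response: ItemResponse) -> int:
    # A point requires both correct pronunciation and matching the term to the key.
    return 1 if response.pronounced_correctly and response.chosen_option == "key" else 0

def sahlpa_score(responses: list) -> int:
    # Total score: sum of correct answers (0-50 for the full instrument, 0-33 for SAHLPA-33).
    return sum(score_item(r) for r in responses)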

SAHLPA was validated in a convenience sample of 226 Brazilian adults over the age of 60. Construct validity was assessed through correlation with formal education, self-reported functional literacy and global cognitive testing. The cut-off point for inadequate health literacy was defined by the inability of a sub-sample of the participants to fully understand a medical prescription, and was ≤42 for the 50-item version and ≤14 for the short version (SAHLPA-18). Both the full (50-item) and reduced (18-item) versions showed good psychometric properties (Cronbach's α = 0.93 and 0.90, respectively) and high correlation (>0.60) with the variables used for construct validity testing. The full version takes 3-6 minutes to administer and the short one 1-2 minutes.

Cultural adaptation of SAHLPA to European Portuguese

We used the standard procedure for instrument adaptation to other populations.[18] An expert committee (with backgrounds in family medicine, internal medicine, pharmacy, psychology, and sociology) culturally adapted the Brazilian Portuguese SAHLPA into European Portuguese, ensuring semantic and item equivalence. To preserve semantic equivalence, some words were altered: “recreação” was replaced by “lazer”, “similar” by “semelhante”, “matrimônio” by “casamento”, “coceira” by “coçar”, and “tranquilo” by “calmo”. Other words were changed to match the correct spelling used in Portugal and to accommodate spelling differences between Brazil and the other Portuguese-speaking countries: “estresse” was replaced by “stress”, “Papanicolaou” by “Papanicolau”, “dolorido” by “dorido” and “contraceptivo” by “contracetivo” (Table 1). Items were otherwise considered culturally and socially equivalent. In addition, two native Portuguese speakers fluent in Spanish translated SAHLSA independently and merged their translations into a single European Portuguese version. Next, two native Spanish speakers proficient in Portuguese independently back-translated this version and arrived at a consensus back-translation, which was then revised and compared to the original by the committee, resolving any discrepancies between the two versions. This second European Portuguese version was then compared with the Brazilian one and with the first European Portuguese translation. No additional changes were made. Because of differences in word pronunciation between regions in Portugal, all but overtly inappropriate accentuations (e.g. ignoring written accents) were accepted as correct.

Table 1.
Correct answers per item and standardized factor loadings in exploratory factor analysis.

A pilot version was administered to a sample of six people (men and women between the ages of 15 and 65), and the wording of the instructions was adjusted for clarity.

Sample and recruitment

The adapted version of the instrument was administered to a convenience sample of 249 people, as part of a validation study of individual health literacy instruments in the Portuguese population.[12] Participants were recruited from four different groups: physicians from public hospitals and primary care health centres (n = 53), health researchers from a research institute in public health (n = 45), researchers from areas unrelated to health from an engineering faculty (n = 50), and laypersons from the general population recruited among users of a primary care health centre (n = 101). We followed the administration instructions of the original instrument, i.e., participants were shown the laminated flash cards by a trained interviewer and were asked to read the bolded term out loud and to choose the associated term from the two options at the bottom.

Eligibility criteria for the participants were age over 18 years and ability to speak and read Portuguese. Potential participants with impaired vision were excluded.

Statistical analysis

Participant characteristics are described by validation group using frequencies for sex and the median [25th-75th percentiles (P25-P75)] for age, and compared across groups using the χ2 test for sex and the Kruskal-Wallis test for age.
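
For illustration, assuming the participant data were held in a table with columns named group, sex and age (a hypothetical layout, not the study's actual files), these comparisons could be run as follows:

import pandas as pd
from scipy.stats import chi2_contingency, kruskal

df = pd.read_csv("participants.csv")  # assumed columns: group, sex, age

# Chi-squared test comparing the sex distribution across the four validation groups
chi2, p_sex, _, _ = chi2_contingency(pd.crosstab(df["group"], df["sex"]))

# Kruskal-Wallis test comparing age across the validation groups
h_stat, p_age = kruskal(*[g["age"].values for _, g in df.groupby("group")])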

Exploratory factor analysis (by common factor analysis) was performed on the 50 items, and visual analysis of the scree plot was used to evaluate homogeneity (i.e., to verify there was a single latent factor measuring reading skills and comprehension). An item was considered to load on a factor when it showed an absolute factor loading higher than 0.5. Items with clear ceiling effects (100% of participants answering correctly) and items with loadings <0.5 were removed from the instrument. Cronbach's alpha with its 95% confidence interval (95%CI) was used to measure internal consistency. Physicians were excluded from these analyses, since they are not part of the target population of the instrument. The global goodness of fit of the underlying model was evaluated using the comparative fit index (CFI), recommended for sample sizes below 250, and the root mean square error of approximation (RMSEA) with its 90%CI. We considered the model to have good fit when the CFI was higher than 0.95 and the RMSEA was lower than 0.06.[19]
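
The analyses in this paragraph were run in MPlus; purely as an approximate sketch of the same checks (scree-plot eigenvalues and Cronbach's alpha) on a participants-by-items matrix of 0/1 scores, one could compute:

import numpy as np

def scree_eigenvalues(items: np.ndarray) -> np.ndarray:
    # items: participants x items matrix of 0/1 scores.
    # Items everyone answered correctly (ceiling effect) have zero variance and must be
    # dropped before computing the inter-item correlation matrix.
    varying = items[:, items.var(axis=0) > 0]
    corr = np.corrcoef(varying, rowvar=False)
    return np.sort(np.linalg.eigvalsh(corr))[::-1]  # plot these to inspect dimensionality

def cronbach_alpha(items: np.ndarray) -> float:
    # alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances / total_variance)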

To assess construct validity, we assumed that physicians would score highest on the health literacy test, followed by health researchers, people with a similar academic degree in areas unrelated to health, and finally laypersons from the general population. Raw scores were compared across these validation groups with the Kruskal-Wallis test, complemented by pairwise comparisons with a Bonferroni correction to adjust p-values for multiple comparisons. To further test construct validity, SAHLPA scores were also dichotomized at the median of the laypersons subsample into adequate health literacy (scores at or above the median) and limited health literacy. Fisher's exact test was used to compare the proportions of limited health literacy across validation groups, with a test for linear trend. Logistic regression was used to calculate odds ratios (OR) and 95%CI comparing the odds of limited health literacy across age and education groups. Physicians were excluded from the regression analyses, since the instrument was not developed to assess them. A sensitivity analysis was performed restricting the regression analysis to the laypersons subsample. Two-sided p-values below 0.05 were considered statistically significant.
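
A rough sketch of these construct-validity steps (dichotomizing at the laypersons' median and estimating sex- and age-adjusted odds ratios by logistic regression), again with assumed column names and using statsmodels rather than the Stata routines actually employed:

import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("participants.csv")  # assumed columns: score, group, sex, age_group, education_group

# Dichotomize at the median of the laypersons subsample: scores below the median = limited health literacy
cutoff = df.loc[df["group"] == "laypersons", "score"].median()
df["limited_hl"] = (df["score"] < cutoff).astype(int)

# Logistic regression of limited health literacy on education, adjusted for sex and age group,
# excluding physicians (not part of the instrument's target population)
sub = df[df["group"] != "physicians"]
X = sm.add_constant(pd.get_dummies(sub[["education_group", "sex", "age_group"]], drop_first=True).astype(float))
fit = sm.Logit(sub["limited_hl"], X).fit()
odds_ratios = np.exp(fit.params)  # adjusted ORs
ci_95 = np.exp(fit.conf_int())    # 95% confidence intervals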

Exploratory factor analysis models were fitted using MPlus (V.5.2; Muthen & Muthen, Los Angeles, California, USA). All other statistical analyses were performed using STATA11®.

Ethics review and consent

The present investigation was carried out in accordance with the Code of Ethics of the World Medical Association and the Declaration of Helsinki, and was approved by the Ethics Committee of Centro Hospitalar de São João and the National Committee for Data Protection. The authors of both SAHLSA and SAHLPA authorised the adaptation and validation of the instrument in the Portuguese population. Each participant provided written informed consent.

Results

Characteristics of the sample are summarized in Table 2. Women made up the majority of respondents in all validation groups except for the group of engineering researchers (p <0.001). Engineering researchers and laypersons from the general population were older (p <0.001).

Table 2.
Characteristics of the sample by validation group.

The scree plot curve inflected at the first component, revealing that the instrument had a single dimension (Fig. 1). This dimension explained 44.4% of the total variance. The global fit of the underlying model was good (CFI = 0.97 and RMSEA = 0.037; 90%CI: 0.030-0.043). Two items (SAHLPA 5 and 7) were removed because of a ceiling effect and 15 items because they had factor loadings below 0.5 (Table 1). The final version contained 33 items (SAHLPA-33) (see the online Appendix to this article). SAHLPA-33 showed an adequate degree of reliability, with a Cronbach's alpha of 0.73 (95%CI: 0.68-0.78).

Figure 1.
Scree plot of eigenvalues after exploratory factor analysis.

The distributions of SAHLPA-33 scores were left-skewed with positive kurtosis in all validation groups, although the distribution shapes differed (Fig. 2). Scores ranged from 24 to 33 in the general population subsample, from 31 to 33 among researchers, and from 32 to 33 among physicians.

Figure 2.
SAHLPA-33 score distribution by validation group.

There was a statistically significant difference in the mean ranks of SAHLPA-33 scores between the four validation groups (p <0.001), with the group of laypersons from the general population exhibiting a lower mean rank of scores than the other groups (all p <0.001), and the group of engineering researchers showing a lower mean rank of scores than physicians (p = 0.042).

Using health literacy as a binary variable, our data revealed evidence of an association between limited health literacy and validation group (Fisher's test p <0.001; p for trend of the original hypothesis <0.001). In the regression analyses, limited health literacy was less common with increasing age, although not significantly so (Table 3). There was a negative association between limited health literacy and education attainment (p for trend <0.001). The strongest association was observed for people with education attainment above the twelfth grade, who were significantly less likely to have limited health literacy than people with education attainment below the ninth grade (sex- and age-adjusted OR = 0.05; 95%CI: 0.02-0.15).

Table 3.
Odds ratios and 95% confidence intervals (95%CI) for the association between sample characteristics and limited health literacy.

When considering only the subsample of laypersons, results were similar: there was a significant negative association between limited health literacy and education attainment (p for trend = 0.001) and no significant association with age, although the direction of the association was the same.

Discussion

We adapted a brief and simple health literacy instrument to European Portuguese, and showed that it was valid and fairly reliable in the Portuguese population. Regarding construct validity, health literacy was significantly associated with health occupation and higher education attainment.

Our results revealed an evident left skew and positive kurtosis in the SAHLPA scores. This asymmetry in the score distribution was more pronounced in our study than in the Brazilian one, which may be explained by our use of a more diverse and more literate sample: the average score of the 50-item SAHLPA in our sample was 6 points higher than that found in the Brazilian study, even when considering only the laypersons subsample (43.8; standard deviation [SD] = 4.4 vs. 37.7; SD = 9.0), and 9 points higher when considering the whole sample (46.7; SD = 3.8).[14] Furthermore, validation of SAHLPA in the Brazilian population was restricted to patients over 60 years old (mean 74.4 years), and a quarter of that sample (25.7%) had less than 4 years of schooling.[14] In contrast, our sample included participants between 18 and 86 years of age (median 38.5 years), and only 14.8% had less than 4 years of schooling. Hence, our study design oversampled people with higher health literacy, pushing scores to the upper end of the scale.

Our findings show that SAHLPA-33 is fairly reliable. Its lower internal consistency (Cronbach's alpha = 0.73), compared to that of the Brazilian SAHLPA-18 version (Cronbach's alpha = 0.90), could be explained by the lower variability in the score distributions, which is known to lead to underestimation of reliability.[20]
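
For reference, the usual formula for Cronbach's alpha makes this dependence on score variability explicit: for k items with item variances \sigma_i^2 and total-score variance \sigma_X^2,

\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} \sigma_i^2}{\sigma_X^2}\right),

so when scores cluster near the ceiling and inter-item covariances shrink, \sigma_X^2 approaches the sum of the item variances and alpha is pushed downwards.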

Although two different screening instruments previously validated in the Portuguese population were available,[12,16] we decided not to test concurrent validity, because neither of them is considered a gold standard for health literacy assessment. Instead, our strategy relied on examining known-groups validity, that is, administering the instrument to different groups that should logically have different levels of the construct, to confirm whether the hypothesized difference was reflected in the groups' scores.[21] Thus, we assumed health literacy would decrease across groups with progressively lower familiarity with obtaining and processing health information, in the following order: physicians, health researchers, engineering researchers, and laypersons from the general population. Although our data showed a significant trend (p <0.001), the instrument was better at discriminating people in the lower range of the health literacy spectrum, as it was designed to do.[10,13]

Less educated people tended to have lower health literacy, in accordance with results from previous studies.[22]

We were not able to find a significant association between limited health literacy and age. The magnitude and direction of this association appear to vary according to the type of assessment instrument used. A recent systematic review found that limited health literacy, when assessed using instruments based on medical vocabulary, such as the REALM (the precursor of SAHLPA), is only weakly associated with older age.[23] In contrast, instruments based on reading comprehension, reasoning, and numeracy skills, such as the NVS or the TOFHLA, usually reveal positive associations between limited health literacy and age. The authors argue that crystallized cognitive abilities, such as those involved in word recognition and pronunciation, are not affected by aging-related decline, as opposed to abilities requiring fluid cognition, which are more related to reasoning and problem solving. In addition, it is also plausible that as people age and become more exposed to healthcare, their medical vocabulary increases, altering the traditional direction of the association between limited health literacy and age.[24] Arguably, our study was underpowered to detect this association.

Some limitations are worth pointing out. SAHLPA is based on the REALM, a test widely used to assess health literacy but centred on reading skills. In fact, the REALM was not designed to assess health literacy but to estimate patient reading levels.[7] Some authors have suggested that REALM scores should be treated as a correlate or predictor of health literacy and not as a measure of health literacy per se, because the instrument lacks coverage of three primary content areas of health literacy: comprehension, numeracy, and information seeking/navigation.[25] SAHLSA and SAHLPA, on the other hand, are seen as new instruments[26] because they include comprehension of written health materials and thus have better content validity than the REALM. In addition, as is the case with other instruments that directly test individual abilities, they do not take into account the abilities to interact, communicate or apply critical thinking, which are now included in definitions of health literacy.[27,28] According to more recent guidelines, adequate assessment of structural validity (the degree to which scores are an adequate reflection of the dimensionality of the construct to be measured) requires a sample size of at least five participants per item.[29] Although we had only 3.92 participants per item, based on the theory behind the instrument's development and on the high biserial correlations between the items, it is highly unlikely that the instrument is multidimensional and assesses other health literacy sub-dimensions in addition to word comprehension.[13,14,30] We did not examine test-retest reliability, and future studies using SAHLPA in less literate samples should determine it. They should also help determine an appropriate cut-off for using health literacy as a binary variable.
Future studies should also investigate the relationship between health literacy and gender, as there is a known gender gap in information-seeking behaviour, i.e., women are more likely to engage in information seeking than men.[31]

An instrument based on the Brazilian SAHLPA-18 has recently been validated in the Portuguese population, adding five items, all of them drug-related, to the shortened instrument.[32] It remains undetermined whether the addition of these items significantly increased the difficulty of the instrument, rendering it less appropriate for less literate samples. SAHLPA-33, on the other hand, when compared with two other health literacy measurement instruments (the Newest Vital Sign[16] and METER[12]) using item response theory, has been shown to have better discrimination and precision at lower levels of respondent ability (unpublished manuscript). Future studies should compare the factor structure between the Brazilian and Portuguese populations, with both confirmatory factor analysis and differential item functioning. This comparison should also be done with the abovementioned Portuguese version.

SAHLPA-33 fills the gap in health literacy assessment instruments for Portuguese low-literate adults. In contrast to instruments based on self-assessment questions, which are more vulnerable to non-response bias,[17] it offers an objective way to assess health literacy in this vulnerable group. National and international policies now recognize health literacy as a crucial determinant of health and are focusing on strategies to improve it.[33-35] Although this instrument does not assess the health literacy demands imposed on individuals or the resources available to individuals and communities, i.e. their distributed health literacy,[36] health literacy research in Portugal is very recent and brief assessment instruments are still useful to increase awareness and advance the field. We hope that SAHLPA-33 can help support policy makers and clinicians in providing more effective health education, specifically targeted to adults with low health literacy.

Conclusion

We have adapted a brief and simple instrument to assess health literacy in the Portuguese population. Future studies with less literate samples are needed to supplement and improve on this validation, before SAHLPA-33 is used to explore associations with health outcomes and to guide health interventions, especially in less literate populations. A cross-cultural validation should also be performed to allow comparisons between Brazilian and Portuguese samples, using SAHLPA-18 and SAHLPA-33. In addition, we recommend complementing it with instruments covering other dimensions of the health literacy construct: access, communication, and critical appraisal of health information to make decisions.

What is known about the topic?

Limited health literacy has been linked to more difficult access to care, increased costs and poorer clinical outcomes. Assessing health literacy directly can enable providers and health organisations to target interventions that improve the health literacy of those with lower health literacy and ultimately their health outcomes.

What does this study add to the literature?

SAHLPA may fill the gap in brief health literacy assessment for people with low health literacy in Portugal. Studies with less literate samples are needed to supplement and improve on this validation.

Acknowledgements

We are grateful to each of the participants and to the institutions Faculdade de Engenharia da Universidade do Porto, Instituto de Saúde Pública da Universidade do Porto and Unidade de Saúde Familiar Monte Murado, for enabling participant recruitment.

References

  • 1
    Kwan B, Frankish J, Rootman I. The development and validation of measures of "health literacy" in different populations - a report. Victoria, Canada: University of British Columbia Institute of Health Promotion Research and University of Victoria Community Health Promotion Research; 2006. p. 204.
  • 2
    Dewalt DA, Berkman ND, Sheridan S, et al. Literacy and health outcomes: a systematic review of the literature. J Gen Intern Med. 2004;19:1228-39.
  • 3
    Berkman ND, Sheridan SL, Donahue KE, et al. Low health literacy and health outcomes: an updated systematic review. Ann Intern Med. 2011;155:97-107.
  • 4
    Sørensen K, Pelikan JM, Rothlin F, et al. Health literacy in Europe: comparative results of the European health literacy survey (HLS-EU). Eur J Public Health. 2015;25:1053-8.
  • 5
    Canadian Council on Learning. Health literacy in Canada: initial results from the International Adult Literacy and Skills Survey 2007. Ottawa, Ontario: Canadian Council on Learning; 2007. p. 34.
  • 6
    Haun JN, Valerio MA, McCormack LA, et al. Health literacy measurement: an inventory and descriptive summary of 51 instruments. J Health Commun. 2014;19 Suppl 2:302-33.
  • 7
    Davis TC, Long SW, Jackson RH, et al. Rapid estimate of adult literacy in medicine: a shortened screening instrument. Fam Med. 1993;25:391-5.
  • 8
    Parker RM, Baker DW, Williams MV, et al. The test of functional health literacy in adults: a new instrument for measuring patients' literacy skills. J Gen Intern Med. 1995;10:537-41.
  • 9
    Baker DW, Williams MV, Parker RM, et al. Development of a brief test to measure functional health literacy. Patient Educ Couns. 1999;38:33-42.
  • 10
    Davis TC, Crouch MA, Long SW, et al. Rapid assessment of literacy levels of adult primary care patients. Fam Med. 1991;23:433-5.
  • 11
    Nurss JR, Baker DW, Davis TC, et al. Difficulties in functional health literacy screening in Spanish-speaking adults. Journal of Reading. 1995;38:632-7.
  • 12
    Paiva D, Silva S, Severo M, et al. Cross-cultural adaptation and validation of the health literacy assessment tool METER in the Portuguese adult population. Patient Educ Couns. 2014;97:269-75.
  • 13
    Lee S-YD, Bender DE, Ruiz RE, et al. Development of an easy-to-use Spanish health literacy test. Health Serv Res. 2006;41:1392-412.
  • 14
    Apolinario D, Braga RC, Magaldi RM, et al. Short Assessment of Health Literacy for Portuguese-speaking Adults. Rev Saude Publica. 2012;46:702-11.
  • 15
    Espanha R, Ávila P, Veloso Mendes R. Literacia em saúde em Portugal. Lisboa: Fundação Calouste Gulbenkian; 2016. p. 96.
  • 16
    Paiva D, Silva S, Severo M, et al. Limited health literacy in Portugal assessed with the Newest Vital Sign. Acta Med Port. 2017;12:861-9.
  • 17
    Storms H, Claes N, Aertgeerts B, et al. Measuring health literacy among low literate people: an exploratory feasibility study with the HLS-EU questionnaire. BMC Public Health. 2017;17:475.
  • 18
    Guillemin F, Bombardier C, Beaton D. Cross-cultural adaptation of health-related quality of life measures: literature review and proposed guidelines. J Clin Epidemiol. 1993;46:1417-32.
  • 19
    Hu L, Bentler PM. Cutoff criteria for fit indexes in covariance structure analysis: conventional criteria versus new alternatives. Struct Equ Modeling. 1999;6:1-55.
  • 20
    Sheng Y, Sheng Z. Is coefficient alpha robust to non-normal data? Front Psychol. 2012;3:34.
  • 21
    Davidson M. Known-groups validity. In: Michalos AC, editor. Encyclopedia of quality of life and well-being research. Dordrecht: Springer Netherlands; 2014. p. 3481-2.
  • 22
    Institute of Medicine (US). Committee on Health Literacy. Nielsen-Bohlman L, Panzer AM, Kindig DA, editors. Health literacy: a prescription to end confusion. Washington (DC): National Academies Press (US); 2004.
  • 23
    Kobayashi LC, Wardle J, Wolf MS, et al. Aging and functional health literacy: a systematic review and meta-analysis. J Gerontol B Psychol Sci Soc Sci. 2016;71:445-57.
  • 24
    Barber MN, Staples M, Osborne RH, et al. Up to a quarter of the Australian population may have suboptimal health literacy depending upon the measurement tool: results from a population-based survey. Health Promot Int. 2009;24:252-61.
  • 25
    Dumenci L, Matsuyama RK, Kuhn L, et al. On the validity of the Rapid Estimate of Adult Literacy in Medicine (REALM) scale as a measure of health literacy. Commun Methods Meas. 2013;7:134-43.
  • 26
    Jordan JE, Osborne RH, Buchbinder R. Critical appraisal of health literacy indices revealed variable underlying constructs, narrow content and psychometric weaknesses. J Clin Epidemiol. 2011;64:366-79.
  • 27
Nouri SS, Rudd RE. Health literacy in the "oral exchange": an important element of patient-provider communication. Patient Educ Couns. 2015;98:565-71.
  • 28
    Nguyen TH, Paasche-Orlow MK, McCormack LA. The state of the science of health literacy measurement. Stud Health Technol Inform. 2017;240:17-33.
  • 29
    Mokkink LB, de Vet HCW, Prinsen CAC, et al. COSMIN risk of bias checklist for systematic reviews of patient-reported outcome measures. Quality of Life Research. 2018;27:1171-9.
  • 30
    Wolf EJ, Harrington KM, Clark SL, et al. Sample size requirements for structural equation models: an evaluation of power, bias, and solution propriety. Educational and Psychological Measurement. 2013;73:913-34.
  • 31
    Manierre MJ. Gaps in knowledge: tracking and explaining gender differences in health information seeking. Soc Sci Med. 2015;128:151-8.
  • 32
    Pires C, Rosa P, Vigario M, et al. Short Assessment of Health Literacy (SAHL) in Portugal: development and validation of a self-administered tool. Prim Health Care Res Dev. 2018:1-18.
  • 33
Portuguese Government. Despacho n. 3618-A/2016 [Order no. 3618-A/2016], Portuguese Government. Gabinete do Secretário de Estado Adjunto e da Saúde [Office of the State Secretary to the Minister of Health]. Diário da República, 2nd Series (Mar. 14, 2016).
  • 34
    U.S. Department of Health and Human Services. Office of Disease Prevention and Health Promotion. National action plan to improve health literacy. Washington, DC: U.S. Department of Health and Human Services; 2010.
  • 35
    United Nations Economic and Social Council. Ministerial Declaration - 2009 high level segment: implementing the internationally agreed goals and commitments in regard to global public health. Geneva, Switzerland: United Nations Economic and Social Council; 2009. p. 7.
  • 36
    Batterham RW, Hawkins M, Collins PA, et al. Health literacy: applying current concepts to improve health services and reduce health inequalities. Public Health. 2016;132:3-12.

  • Funding

    This study was funded by FEDER through the Operational Programme Competitiveness and Internationalization and national funding from the Foundation for Science and Technology - FCT (Portuguese Ministry of Science, Technology and Higher Education) to the Unidade de Investigação em Epidemiologia - Instituto de Saúde Pública da Universidade do Porto (EPIUnit) (POCI-01-0145-FEDER-006862; Ref. UID/DTP/04750/2013); and the FCT Investigator contract IF/01674/2015 (to SS).

Publication Dates

  • Publication in this collection
    03 Mar 2021
  • Date of issue
    Sep-Oct 2020

History

  • Received
    17 Oct 2018
  • Accepted
    12 Mar 2019
  • Published
    31 May 2019