Whilst online research allows us to cross borders with ease, conducting cross-cultural research requires many considerations. There can be vast differences in how people respond to survey questions across countries and cultures, so designing international surveys requires some extra care.
How and why do survey responses differ cross-culturally?
The way respondents answer surveys varies significantly between countries, due to a range of cultural and methodological factors that you should consider when planning any cross-cultural study.
For example, levels of agreeability vary dramatically across countries. In India, 75% of answers fall on the positive end of range scales, on average, whereas in Japan only 11% of answers are positive.
The underlying reasons for this are quite complex and driven by psychological factors, but they are compounded by several issues, including:
Language and word association
The same words can be interpreted differently in different countries. The word ‘love’ to describe feelings towards a brand is far more likely to be selected by Americans than by British or Australian respondents, whose use of language is less emotional. Respondents in the UK and Australia are more likely to use the word ‘OK’ in association with a brand; to Americans, however, ‘OK’ is too ordinary.
Number ranges
In most countries, people tend to think of 10 as the highest and 1 as the lowest score you can give when rating things. The opposite can occur in some countries, like China and Germany. In China, 1 is thought of as the best and is much more likely to be considered the top score. The same is true in Germany, which stems from its school marking system, where work is scored on a 1 to 6 scale with 1 being the highest mark. This is why it’s critical to understand the variance in acquiescence bias in each country before designing your questionnaire. Otherwise, it could dramatically impact your data.
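If you plan to compare markets where the scale direction may be read differently, one option is to harmonize answers onto a common direction before analysis. The snippet below is a minimal sketch of that idea in Python with pandas; the data, the list of reversed-scale markets and the column names are all hypothetical, and in practice you would decide which markets (if any) to reverse based on pilot or benchmark data rather than a hard-coded list.

```python
import pandas as pd

# Hypothetical example: ratings collected on a 1-10 scale, where some
# markets may have read the scale with 1 as the best score.
responses = pd.DataFrame({
    "country": ["UK", "UK", "DE", "DE", "CN", "CN"],
    "rating":  [8, 9, 2, 1, 1, 3],
})

# Markets assumed (for illustration only) to have used a reversed scale.
REVERSED_SCALE_MARKETS = {"DE", "CN"}

SCALE_MIN, SCALE_MAX = 1, 10

def harmonize(row):
    """Map every answer onto a common 'higher = better' direction."""
    if row["country"] in REVERSED_SCALE_MARKETS:
        return SCALE_MAX + SCALE_MIN - row["rating"]   # 1 -> 10, 10 -> 1
    return row["rating"]

responses["rating_harmonized"] = responses.apply(harmonize, axis=1)
print(responses)
```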
Comprehension issues
Respondents in some countries can take longer to read and answer a survey than in others. Slower speeds can be caused by linguistic repertoire, the use of characters, a more verbose language style, or answering in a language that isn’t native.
Attitudes towards taking surveys
Unfortunately, we sometimes have to contend with differing attitudes towards taking surveys. In the UK and US, people are less fastidious than in other countries and may straight-line through a survey. Meanwhile, in China and India, there is a higher tendency to overclaim, sometimes intentionally but sometimes due to cultural differences.
How do you design surveys that deliver more consistent data across global markets?
So, what are some of the solutions to ensuring you achieve the most robust, reliable data from your cross-cultural research? Here are three tactics Kantar recommends for reliable data collection:
1. Weighting tables to calibrate basic differences
Calibration data can be used to harmonize differences across countries and respondent groups by mapping out the main cultural markers. This gives you a way of recognizing differences that have a measurable impact when you interpret and compare your data. Be careful when applying weighting techniques, however: the effect varies from question to question, and you need enough data to pull apart the subtleties.
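As a rough illustration of how a calibration table might be applied, the sketch below joins hypothetical per-country weights onto survey responses and recomputes a top-2-box score. The weight values, column names and data are invented for the example and are not Kantar figures; in practice the weights would come from benchmark questions asked identically in every market.

```python
import pandas as pd

# Hypothetical calibration table: one weight per country, derived from
# benchmark questions (illustrative values only).
calibration = pd.DataFrame({
    "country": ["US", "IN", "JP"],
    "weight":  [1.00, 0.80, 1.25],
})

survey = pd.DataFrame({
    "respondent": [1, 2, 3, 4, 5, 6],
    "country":    ["US", "US", "IN", "IN", "JP", "JP"],
    "top2box":    [1, 0, 1, 1, 0, 0],   # 1 = answered in the top two boxes
})

merged = survey.merge(calibration, on="country")

# Weighted top-2-box score per country, pulling markets with a known
# tendency to over- or under-state agreement towards a comparable baseline.
weighted = (
    merged.assign(w_top2box=merged["top2box"] * merged["weight"])
          .groupby("country")[["top2box", "w_top2box"]]
          .mean()
)
print(weighted)
```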
2. Adopt a more visual approach to asking questions
Visuals and icons can help introduce more consistency than words. Using imagery not only improves attention but can also reduce inter-country variance, because people interpret pictures more consistently than language nuances. For example, we’ve found that using faces rather than words can bring the variance between countries down by almost a third.
Visuals can also increase engagement and the attentiveness of respondents more generally. By bringing the question to life and making respondents concentrate more, you will yield more consistent data.
3. Change how you ask questions
Cultural bias can be considerably reduced if you look at asking questions in different ways. For example, you can overcome agreement scale problems by replacing rating scales and rankings with tools like MaxDiff (maximum difference scaling) and custom scale anchor points. You may find they yield a better spread of answers and a clearer understanding of respondents.
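For instance, a very simple way to score a MaxDiff exercise is to count how often each item is chosen as best versus worst, relative to how often it is shown. The sketch below illustrates that counting approach with hypothetical data; real MaxDiff studies typically use more sophisticated models, so treat this purely as an illustration of why the technique avoids rating scales altogether.

```python
import pandas as pd

# Hypothetical MaxDiff data: each row is one item shown in one screen,
# flagged if it was picked as "best" or "worst" in that screen.
tasks = pd.DataFrame({
    "item":  ["A", "B", "C", "A", "B", "C", "A", "C", "D", "B", "C", "D"],
    "shown": [1] * 12,
    "best":  [1, 0, 0, 0, 1, 0, 1, 0, 0, 0, 0, 1],
    "worst": [0, 0, 1, 0, 0, 1, 0, 1, 0, 1, 0, 0],
})

# Counting score: (best picks - worst picks) / times shown.
scores = (
    tasks.groupby("item")[["shown", "best", "worst"]].sum()
         .assign(score=lambda d: (d["best"] - d["worst"]) / d["shown"])
         .sort_values("score", ascending=False)
)
print(scores)
```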
Learn more
Keen to learn more? Access all our recommendations in our Online Survey Design Module: Factors to Consider When Conducting Cross-Cultural Research.
If you like these research tips and want to see more, use the form below to subscribe and receive a monthly research tip on online survey design, sampling, data integration and more, as well as updates when new Online Survey Training Modules are released.