Validity

The validity of the answers is a major concern in survey-based research, particularly in surveys of sensitive behaviours such as substance use. In ESPAD terms, validity could be said to be the degree to which the survey (including its methods of data collection) measures those aspects of students’ consumption of different substances that we intend to measure. In the 2011 ESPAD report (Hibell et al., 2012), the validity of the ESPAD survey was thoroughly discussed and the conclusion, based on the relevant available research, was that validity can be considered high in (ESPAD-like) school surveys. One factor pointed out as particularly important was that students trusted that their responses were anonymous when filling out the questionnaire. Below, a number of topics important for validity are presented in relation to the 2015 data collection.

Translation of the questionnaire

The comparability of the actual questionnaire across countries is of vital importance in any multinational survey project. Establishing the equivalence of the translations of questions into the various languages is therefore an important aspect of establishing validity. The ESPAD master questionnaire is presented in English. In non-English-speaking countries, the questionnaire should be translated into the local language(s) and then back-translated into English by another translator, whereupon the original version and the back-translated version are to be compared for anomalies.

However, the equivalence of questionnaires is not only a matter of literal translation equivalence. It is also a matter of equivalence of understanding, meaning that each question should be understood in the same way in all countries, irrespective of the original wording in the master questionnaire. When necessary, the questions have been culturally adjusted to suit the situation in individual countries. For instance, the slang words for different substances asked about in the questionnaire should be adjusted to the situation in each country. If this is not done properly, comparability with other countries may be undermined.

No major problems with the translations have been reported or detected. On the whole, it seems reasonable to assume that the translation of the questionnaire was not a major methodological problem and does not jeopardise the comparability of the results between the ESPAD countries.

Student cooperation

The primary prerequisites for obtaining any data at all are that students in selected classes actually receive the questionnaire and that they are willing to fill it in. The first prerequisite is not met if the school or the teacher refuses to cooperate. If students do receive the questionnaire, they must have enough time to complete it, they must understand the questions and they must be willing to answer the questions honestly.

Participation in the study was, of course, voluntary. However, in nearly all countries, no or very few students were reported to have declined to take part (Table C). On average, 0.5 % (0.0-1.5 %) of the students present in the classrooms refused to take part in the survey. In Ireland, Latvia, Portugal, Romania and Sweden these rates were above 1 %.

Some form of parental consent was required in roughly three quarters of the countries. In three of them, active parental consent was required. According to Table C, in countries where only passive consent was requested, 0.5 % (0.0-1.7 %) of the students could not take part in the study due to parental refusal. In the three countries where active consent was needed, refusal rates were higher: Georgia 2.0 %, Portugal 6.0 %, Romania 6.9 %. Hence, parental collaboration can be deemed lower in the latter two countries.

Since the reasons for parental refusal are not known, it is unclear whether refusal is linked to the subject of the survey, i.e. substance use. Although the uncertainty is greater where the proportion of students denied permission was larger, it seems reasonable to assume that the survey topic was, in most cases, not the main reason why parents withheld permission. Hence, parental refusal is not seen as a methodological problem that influences comparisons between countries to any important degree, although for the countries with the highest figures the results carry some measure of uncertainty.

As described before, all data were centrally cleaned in a standardised way. With few exceptions, only a relatively small fraction of the questionnaires were discarded during the cleaning process. On average, 1.8 % of the questionnaires were excluded (Table C). Some countries clearly displayed greater proportions, including Cyprus (3.8 %), Austria (4.2 %), Norway (4.2 %) and Latvia (7.6 %). This may indicate somewhat poorer student cooperation in these particular countries, although technical issues may also have contributed to these levels, at least for Latvia. However, overall, the proportions of discarded questionnaires do not indicate any significant problems relating to student cooperation.

In a standardised classroom report, the survey leaders were asked (a) to report disturbances in the classroom during the data collection, (b) to indicate the extent to which the students had worked seriously and (c) to state whether the students seemed to have had difficulties understanding the questions. On average, 75 % of the survey leaders reported that there were no disturbances during data collection. In four countries (Greece, Moldova, Slovakia and Ukraine) these levels were lower (around 50 %). However, it should be noted that research assistants or survey leaders other than teachers were responsible for the data collection in all the countries from which disturbances were more frequently reported. Compared with teachers, they are likely less used to the normal level of disturbance in a classroom and thus more likely to report disturbances.

In most of the countries, a majority of the survey leaders (64 %) reported that ‘all’ students worked seriously and an additional 34 % indicated that the majority had done so (Table J). On the other hand, 2 % of the survey leaders reported that less than the majority had been working seriously. These levels were somewhat higher (4-6 %) in Croatia, the Czech Republic, the former Yugoslav Republic of Macedonia, France, Latvia and Poland. Even though the proportions were low, this may indicate a somewhat less favourable survey setting than in the average ESPAD country.

a Proportion of survey leaders answering ‘rather difficult’ or ‘very difficult’.
b Cyprus did not use the standard classroom questionnaire but reports that there were ‘high ratings offered by supervisors with respect to the cooperation of the students’.
c Official name: former Yugoslav Republic of Macedonia.

In summary, no country reported problems with many students declining participation. The proportion of discarded questionnaires was low in nearly all countries, with an average of 1.8 %. When there were disturbances during data collection, they rarely involved more than a few students. Even when fairly high levels of disturbances were reported from some countries, they seem very rarely to have had a negative effect on student cooperation. In fact, most survey leaders reported that all or the majority of students worked seriously. In the countries with lower rates, those responsible for data collection were non-teachers, who were most probably less familiar with the normal noise level in a classroom. Hence, student cooperation seems to have been good or very good in nearly all participating countries.

Even though overall student cooperation seems to have been satisfactory, two remarks need to be made in this respect. One is the fact that a fairly large number of questionnaires were removed from the Latvian database (7.6 %), even though technical issues may also have contributed. The other is that the circumstances surrounding the data collection remain unknown for Cyprus, since standardised classroom information was not collected there.

Student comprehension

All countries asked all or nearly all of the core questions from the ESPAD master questionnaire (Table K). A majority of the countries also included the module about risky cannabis consumption (the cannabis abuse screening test (CAST)) as well as several of the optional questions. Most countries also included at least some national questions.

The total number of questions in the national questionnaires varied across countries. The average number of items (with each subquestion of a question being counted as an item) was 293, the smallest number being 234 in Estonia and the largest being 416 in Ukraine (Table K). Naturally, the length of the questionnaire has an effect on the time taken to complete it. In addition, differences in students’ experience of participating in studies of this type may also affect the time for completion. For these and other reasons, it is not surprising that the time taken to respond to the questionnaire varied across countries.

The average response time was 38 minutes (Table K). The highest figure (53 minutes) was reported from Greece. A rather long average completion time was also reported in the Faroes and Moldova (52 minutes) and in the Czech Republic, Portugal and Ukraine (around 46 minutes). No country reported refusal by students to complete the questionnaire because of its length.

In a few countries, more than 10 % of the survey leaders thought that the students had had some difficulties responding to the questionnaire (average 5 %). The highest proportion was found for Belgium (Flanders) (22 %) (Table J). It should be noted that the Belgian figure also included information from classes in more junior grades, where very few students in the ESPAD target group were to be found. Presumably, the corresponding figure for the Belgian ESPAD target population only would be considerably lower. The levels were more than twice the average in the Czech Republic, Latvia and Lithuania (between 11 % and 15 %).

Overall, student comprehension seems to have been satisfactory in most participating countries. However, the longer the time needed to fill in the questionnaire, the greater the risk that some students may grow tired towards the end and start giving less reliable answers. Even though this might have happened in some countries, it should be kept in mind that the ESPAD core questions were at the beginning of the questionnaire and thus less affected by possible fatigue linked to the length of the questionnaire.

Anonymity

It is crucial in surveys about deviant behaviour, such as illicit drug use, that the respondents are confident that reporting such behaviour will not entail any negative consequences for them. It is therefore important that the students understand that the survey is anonymous. Several measures were taken to ensure perceived as well as actual anonymity. The ESPAD handbook recommends that an individual envelope be distributed along with the questionnaire. This gives the students the possibility to seal the questionnaire right after completion. In 23 ESPAD countries, such individual envelopes were used (Table G). Countries that did not use individual envelopes used other methods to ensure that the students felt that their anonymity was safeguarded. These methods included individual stickers, a closed box or a joint envelope for the entire class, often sealed in front of the class before being sent off to the research institute. If the data collection was performed online, the data were stored on a central server to which only the research team had access.

The survey leader could be either a teacher or a research assistant. The decision as to the most suitable survey leader was taken by each country. The basis for this decision should, of course, be that the person most trusted by the students is chosen. In about half of the ESPAD countries, teachers or other members of school staff functioned as survey leaders, while the other half chose research assistants or other people from outside the school (Table G). The survey leaders were asked to stress the issue of anonymity and to refrain from walking around in the classroom while the questionnaires were being completed. The students were instructed, verbally and in writing on the first page of the questionnaire, that they should not put their names on the questionnaire or the envelope.

No country reported any serious doubts among the students regarding anonymity issues. Overall, anonymity seems to have been handled satisfactorily in all participating countries.

a According to country report and not according to classroom data.
b Maximum time allowed, not average time used.
c Official name: former Yugoslav Republic of Macedonia.

Data entry and rates of missing data

Twenty countries entered the data manually, while 11 used optical scanning. In four countries, no data entry process was necessary, since data were collected online using a web-based questionnaire (Table G). All countries were encouraged to perform quality checks of the entered data, and no particular problems were reported as a result of these checks.

In the instructions given to the students it was stressed that it was important for them to answer each question as thoughtfully and frankly as possible. Since participation in the study was voluntary, however, students may have skipped questions they found objectionable. Rates of missing data on substance use questions may indicate the respondents’ willingness to report such use.

The proportion of unanswered questions was low for all substances (Table E). After data cleaning, the average proportion of non-responses on lifetime use ranged from 0.2 % (lifetime prevalence of ecstasy use) to 1.4 % (lifetime prevalence of alcohol use). There were no alarmingly high numbers of unanswered questions on lifetime substance use in any country. The highest rates were found for cigarettes and alcohol in Portugal (around 4 %). For illicit substances, the highest non-response rates found for any country were just above 1 %. Non-response to single (sensitive) questions is thereby not judged to be an important methodological problem in the ESPAD 2015 data collection.
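As a rough illustration of the rates discussed above, an item non-response rate is simply the share of blank answers among the questionnaires retained after cleaning. The sketch below uses made-up data with `None` marking an unanswered item; it is not actual ESPAD data or code.

```python
# Illustrative sketch of an item non-response rate. The data are made up
# (not ESPAD figures); None marks an unanswered item on one question.
responses = [1, None, 3, 2, None, 1, 1, 2, 1, 4]

# Share of blank answers among all retained questionnaires, in per cent.
missing_rate = 100 * sum(r is None for r in responses) / len(responses)
print(f"{missing_rate:.1f} % missing")  # prints "20.0 % missing"
```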

Logical consistency

A measure closely related to the inconsistency measures discussed in the reliability section is logical consistency. In the ESPAD questionnaire this is relevant for sets of substance use questions measuring use during three time frames: lifetime, the last 12 months and the last 30 days. Logically, the figure for prevalence in the last 12 months cannot exceed lifetime prevalence, and the 30-day prevalence cannot exceed either the 12-month prevalence or the lifetime prevalence.
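The ordering rule just described can be expressed as a simple check. The function below is a hypothetical sketch, not ESPAD's actual cleaning code; it treats the answers as ordinal frequency codes, where a higher code means use on more occasions.

```python
def is_consistent(lifetime, last_12_months, last_30_days=None):
    """Return True if the reported frequency codes respect the logical
    ordering 30-day <= 12-month <= lifetime. For substances where only
    two time frames are asked (e.g. ecstasy, inhalants), last_30_days
    is omitted and only the 12-month/lifetime pair is checked."""
    if last_12_months > lifetime:
        return False
    if last_30_days is not None and (last_30_days > last_12_months
                                     or last_30_days > lifetime):
        return False
    return True

# A student reporting more occasions in the last 30 days than ever in
# their lifetime is flagged as logically inconsistent.
print(is_consistent(lifetime=2, last_12_months=2, last_30_days=3))  # False
print(is_consistent(lifetime=5, last_12_months=3, last_30_days=1))  # True
```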

Table L includes information on the proportion of inconsistent answers relating to these three time frames for three variables: alcohol use, having been intoxicated and cannabis use. For ecstasy use and use of inhalants only lifetime and 12-month use are compared. In nearly all countries and for all five variables the reported proportions of inconsistent answers were relatively low. In other words, the proportion giving logically consistent answers across the three (or two) time frames can be considered sufficient.

Fairly high proportions of inconsistent answers were found in a few countries. To a large extent, they relate to alcohol use. Inconsistent answers on alcohol use were given by roughly 12 % of the students in Albania, Georgia and Cyprus. Across the five variables, Cyprus together with Bulgaria tended to display an overall less-favourable consistency, indicating somewhat lower data quality in relation to this aspect in these countries. With the exceptions mentioned, logical consistency seemed to be relatively high in the participating countries.

Under-reporting

One important methodological problem in surveys relates to social desirability, i.e. the tendency of respondents to give answers that they believe will show them in a good light in the eyes of others. This factor becomes particularly important in surveys relating to behaviours that are not accepted in a society or are even illegal.

At the end of the core part of the questionnaire used in the 2015 ESPAD survey, students were asked about their hypothetical willingness to admit cannabis use. The wording was, ‘If you had ever used marijuana or hashish, do you think that you would have said so in this questionnaire?’ The response options were ‘I already said that I have used it’, ‘Definitely yes’, ‘Probably yes’, ‘Probably not’ and ‘Definitely not’.

The proportions of students claiming that they would definitely not report cannabis use are shown in Table L. In two thirds of the countries, between 5 % and 10 % replied that they were definitely unwilling to admit cannabis consumption if they had used it. The highest figure, 24 %, was reported from the former Yugoslav Republic of Macedonia. In Albania, Croatia, Latvia, Lithuania, Montenegro and Romania the values ranged between 15 % and 20 %.

A higher proportion of students replying that they would not be willing to admit cannabis use might signal problems with validity, but this is not necessarily the case. In fact, students who have never used illicit drugs may tend to be rather strongly opposed to their use, and this opposition may in part be reflected in their answers to this hypothetical question. To the extent that the response to this question reflects the opinion of the population of non-cannabis users, the result will yield an overly pessimistic view of the actual willingness of the cannabis-using population to report such use. It should also be borne in mind that the question is hypothetical. If a student tried cannabis in the future, he or she might be willing to admit it in a survey even if a negative answer had been given this time. Combining these two arguments gives rise to a third reflection: if, in the future, a student decides to try an illicit substance for the first time, the very reasons that caused him or her to try the drug might also entail a changed willingness to admit its use.

The question about hypothetical willingness to report cannabis use may be most useful in a cross-cultural context. In countries where a high proportion would definitely not admit such use, many adolescents apparently consider it so shameful that they could not even hypothetically imagine reporting it. The figures for the unwillingness to admit cannabis use were rather high in some countries and much lower in others, indicating that the level of under-reporting may vary across countries.

It can be concluded that surveys most probably underestimate the prevalence of illicit substance use, that under-reporting probably differs somewhat across countries and that under-reporting of illicit drug use might be higher in the seven countries mentioned above. There is, however, no reason to believe that such differences would undermine the overall conclusions of the study. Hence, low-prevalence countries would most likely have remained low-prevalence countries even if all students who had taken illicit drugs had admitted their use.

Table L. Some aspects of validity: inconsistent answers, unwillingness to admit cannabis use and reported use of the dummy drug ‘relevin’. Percentages. ESPAD 2015

a For each substance, an inconsistent response pattern is defined as one in which any of the following is found: (a) 30-day frequency is higher than annual frequency; (b) 30-day frequency is higher than lifetime frequency; or (c) annual frequency is higher than lifetime frequency. For ecstasy and inhalants, only lifetime and annual frequency are compared.
b Students answering ‘definitely not’ to the question ‘If you had ever used marijuana or hashish (cannabis), do you think that you would have said so in this questionnaire?’.
c Instead of relevin, some countries used national alternatives as a dummy drug.
d Official name: former Yugoslav Republic of Macedonia.

Over-reporting

In addition to the risk of under-reporting substance use, there is also the risk of respondents exaggerating their substance use experience, which may also threaten the validity of the results. To test this, the dummy drug ‘relevin’ was included among a list of existing substances in the questionnaire. Countries may use another name instead of relevin for the dummy drug, if there is a risk that the students may confuse it with a national street name for any existing substance.

The average across all ESPAD countries for reported relevin use was 0.7 % (Table L). In Bulgaria, Cyprus, Georgia and Poland, however, the proportion of students reporting use of the dummy drug was higher than average (around 2 %). With the exception of those four countries, few students reported any use of the dummy drug, indicating that students do not routinely exaggerate their substance use. It seems reasonable to assume that high prevalence rates for drug use are in practice nearly unaffected by any general tendency to exaggerate drug use. However, these findings also underline the need for caution in interpreting the prevalence of less common drugs such as heroin and LSD. For each country, the proportion reporting use of the dummy drug could serve as a plausibility baseline: if, say, 0.7 % of students in a country claim to have used the dummy drug, then prevalence figures for a real drug of around that magnitude should be interpreted with caution.
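The idea of treating the dummy-drug rate as a noise floor can be sketched as a simple subtraction. This is a hypothetical illustration with made-up figures, not a correction method applied in the ESPAD report.

```python
def plausible_prevalence(reported_pct, dummy_pct):
    """Lower-bound prevalence estimate that treats the dummy-drug
    reporting rate as a noise floor for over-reporting. Both arguments
    are percentages; the result never goes below zero."""
    return max(0.0, reported_pct - dummy_pct)

# With 0.7 % claiming the dummy drug and 1.0 % claiming a rare real
# drug, only about 0.3 percentage points lie clearly above the noise.
adjusted = plausible_prevalence(1.0, 0.7)
print(round(adjusted, 1))  # prints "0.3"
```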