In addition to protecting its content, the way in which surveys are carried out can also affect the final output of data collection. Biases may arise from the sample size and from the population in which the surveys are conducted. The collection tools themselves are equally relevant: questionnaire sentences must be clear and concise, and the terminology must be plain and easy to understand. Overly long sentences divert the respondent's attention and may introduce a boredom bias that undermines the reliability of the results.
Furthermore, the willingness of respondents to take part in surveys must be considered carefully, since a lack of interest or time makes it difficult to maintain acceptable response rates, especially for voluntary surveys. This is one of the most challenging issues in conducting surveys, owing to the increasing difficulty of reaching respondents through digital campaigns (e.g. number blocking) and to respondents' growing concern with privacy.
Canada’s Compendium (2016) reports that it is necessary to make sure the above-mentioned criteria are respected. To do so, it recommends testing these tools in advance on “a representative sample of respondents (individuals, households or businesses).” Statistics Canada is working to maintain or improve participation rates in surveys in order to enhance the quality of the results. The key points through which Statistics Canada will pursue this target are improving reminder and refusal strategies, increasing the number of online questionnaires for data collection, implementing persuasive techniques and interactive communication strategies with respondents, and strengthening the use of administrative data to “support the survey process and reduce the response burden” (Canada, 2016).
For a survey to be effective, time must be devoted to planning the investigation correctly and consistently. Canada’s Compendium (2016) clearly states the objectives of a successful survey: valid collection tools that respect the requirements reported above, an established governance structure, and an awareness plan for respondents so that they feel involved in the survey and understand their importance to the program. Staff should also aim to maintain the relationship with respondents.
There are, however, challenges that collection tools have to face. Modernizing them to improve the efficiency of collection operations also implies costs for National Statistical Offices, such as the costs of adopting and implementing the technology. The Compendium highlights two main solutions to this challenge. The first is to adopt a single versatile collection tool, the Integrated Collection and Operation Systems Initiative (ICOS), which would help achieve the highest degree of functionality and use of the internet in e-questionnaires. The other is to implement a “coordinated approach to the acquisition and implementation of a new technology”. This solution is embodied in the experience of Cape Verde, which adopted personal digital assistants (PDAs) in the 2010 General Population and Housing Census, and in the south-south co-operation between Brazil, Cape Verde, Côte d'Ivoire and Senegal.
In that context, the National Institute of Statistics (INE) of Cape Verde used mobile devices for data collection. This was a key event, as it was the first digital census in Africa; it helped reduce both the cost and the time of the collection itself and of the publication of results, but