As part of our work on De Facto, the partners carried out a comprehensive survey which collected data on practices and perceptions with regard to using and working with information in teaching and learning settings, with a focus on learning assignments.
The survey was split into two parts in accordance with the two principal target groups - teachers & educators and learners. The survey was designed to provide insight into the attitudes and practices of teachers in secondary education, higher education and adult/continuing education. The learners' survey mirrored these levels.
We collected responses in 7 languages in the 7 partner countries: Belgium, Bulgaria, Czechia, Italy, Poland, Slovenia and the United Kingdom. No quota was set for each country, and there was also a very small number of respondents from other countries. Our objective was to get a snapshot of our field of study, and we recognised from the outset that there are significant differences in training and learning contexts within a country and between countries, in the practices of teachers at these levels and even within a single training institution, and in individual learners' attitudes and practices. We were, therefore, attempting to unveil parts of the whole picture. This, among other things, means that the results should be considered and interpreted in conditional terms. We were surprised to find that there were no statistically significant differences between the individual countries on any question, which allows us to work with the aggregate dataset from all respondents. Some questions were designed as mirrors in both parts of the survey - in these cases we were particularly interested in how the actors in the teaching-learning process approach or interpret specific tasks.
With variations between the different questions based on the number of responses and possible condition logic for some answer sets, we are working with samples of just under 800 for the teachers' part of the survey and just above 2,000 for the learners' part. The survey was active in collecting responses between January 2019 and December 2019, with the bulk of responses concentrated between February and June 2019.
The purpose of this report is to discuss the main findings and conclusions from the survey. It does not follow the survey structure rigidly; it is rather a pragmatic demonstration of key insights, as the analysts understand them. The report is intended for teachers and educators and will hopefully improve the understanding of how both sides of the teaching-learning equation approach the issues related to information sources and veracity.
We thank the respondents, and the many institutional and professional partners who helped us reach a broad respondent base.
In two questions we asked learners what they like (left) and dislike (right) with regard to collaboration with their peers when working on learning assignments. The respondents were asked to choose from a list of suggestions, and they were allowed to choose more than one option.
In an interesting observation made by one of our analysts, when we put the two sets of answers next to each other, it seems that the respondents were more comfortable choosing positive answers than negative ones. This may be a somewhat long-shot approximation to the phenomenon which we study and explain with the equivalency and emphasis frames (one of De Facto's 4 pillars), where we demonstrate that choices framed in positive terms are more likely to be preferred to negative ones, even when the underlying mathematical equivalence of the choice options is maintained (e.g. a '90% chance of success' is typically preferred to a '10% chance of failure').
Note that there is no colour matching or correspondence between the left and right graphs.
Most of the options (in graph A) are chosen by between a quarter and one-third of respondents. The dominant option, chosen by nearly 40% of respondents, seems to be a genuine appreciation of the benefits of teamwork in general. Overall, respondents report as slightly less interesting the more 'specific' answer options, such as deepening of knowledge with the specific purpose of making better choices for one's own education, building of transversal skills, and the positive effect of the process of searching for information.
Most answer options take the respondents to a more abstract level of thinking, where an active projection of the self is required. The very fact that most respondents went for many answers instead of focusing on one or two means that we can be reasonably confident that learners understand the complex role of assignments in the learning process. They seem to understand that the impact is beyond the immediate purpose of the subject they are studying.
We can interpret this in two ways - that the teachers are effective in communicating the reason they give out assignments, or that the learners are well experienced with assignments already and can make an informed judgement on their benefits. Whether one, the other, or a combination of the two, this is a positive finding. It means that assignments are recognised by learners as a meaningful learning activity.
A striking number of respondents (in graph B), 57%, make a fairly categorical statement about the single most important factor for disliking learning assignments. This probably has two sets of factors behind it - the time pressure of the assignment deadline itself, and the time pressure of the collection of tasks and activities which need to be done, in a team, for the assignment to be complete. In a sense, we have a nested structure where the assignment deadline competes with a number of other duties and obligations at the individual level, and then a sub-set of additional pressures arising from the collaborative process itself (e.g. internal milestones, waiting for another learner to do their part so you can proceed with yours, etc.).
It would be interesting, and of clear research value, to look into this more deeply in future studies, so that appropriate measures and strategies can be suggested to learners depending on which time-pressure factor is leading and which are secondary. Also, managing time and stress is an important skill set not only for learning, but also for the work life that follows. Early feedback from the school system on this would allow for identification of genuine problems and related needs. An overwhelming feeling of time pressure and anxiety may well be indicative of a need to build time-management skills, for example.
The above discussion feeds directly into another option, selected by a fifth of learners, who declare that they dislike the need to work in teams. Again, we are not in a position to evaluate the level of discomfort or anxiety that comes with this, and there are at least several hypotheses which could explain why some people may dislike group work and teams. Individual preferences must be respected, and we would not want to mould everyone into a team player - but teachers would benefit from knowing and understanding whether this is the case, as opposed to a simple lack of skills in the presence of a desire to work in a team. A fifth is not a number which can be neglected, so, to be pragmatic, we would advise teachers to be prepared with assignment formats which include more opportunities for engagement for people who have reasons to be less enthusiastic about teamwork.
We apply - and would advise teachers to apply - a similar logic when referring to another reason for disliking assignments, the need to present in front of others (18% of respondents).
Two answers do give rise to some concern - assessing learning assignments as 'too much work compared to the importance of the assignment' (18%) and as 'too academic/not practical' (21%). These reveal the expectation for assignments to be practice-oriented, related to the real world, and for the invested efforts to be balanced with the outcomes. Hand in hand with these two goes a dislike of assignments bearing 'little relevance to the real world' (15%). We would suggest that teachers routinely review their assignments and adjust/update them to bring them in line with such natural expectations.
At the bottom of our chart of dislikes, almost as a sign of acquiescence in their inevitability, stand the lack of love for 'the need to look for resources' (11%) and 'too much work in identifying and scrolling through long lists of sources' (14%). While these are certainly not to be neglected, they seem inevitable in any type of work that involves serious use of external information sources. Still, perhaps teachers could think of, and suggest, methods of work that simplify or facilitate such tasks.
This graph depicts the importance – as self-reported by respondents – which learners in different types of schools attribute to the verification of sources that they use in their learning assignments.
Though our dataset tends to be biased towards secondary education, where the largest part of our respondents come from, the data clearly shows that the realisation of the need to verify one's sources grows with the educational level, becoming most pronounced among university and adult learners.
This may result from the fact that university students do much of the learning on their own, constantly busy with various assignments. Some of these assignments, with extended and expanded content requirements, have a decisive impact on their academic results and advancement, including their qualifying for a degree. Thesis papers are, in their nature, learning assignments, and we are treating them as such in this study.
This opens a question for the analysts: is this only attributable to the heavier emphasis on source verification in higher education, justifiable by the more 'serious' level of study, or does it also show an inherent weakness at the lower, preceding levels of education?
After all, it could be argued, within reasonable limits, that the use of verified and reliable sources of information should be equally important to any educational activity, regardless of learners' age. While at the level of higher education it may be a bridge to the 'purity' of future academic research, at earlier stages it is critical to forming attitudes and building information literacy skills. It seems that, at least in a declarative context, we do end up with a more solid recognition of this fact - the prevailing answers at the university level are 'important' and 'very important' (76% combined). Equally importantly, advancing through the educational levels clearly has the potential to reduce and almost entirely remove the hesitation (don't know).
But while there is a clear majority here, in percentage terms it is still a worrying result that we get this uncertain or laissez-faire attitude with regard to possible disinformation, and it remains a topic that university teachers might want to stop taking for granted, and should bring up for discussion at appropriate times.
Unless educators engage actively, the respondents who fail to present an opinion here, and those who think it is not important at all, will always represent a group that is under increased threat of disinformation and manipulation. Secondary education learners seem particularly vulnerable.
We asked the educators to what extent the use of resources by their learners corresponds to their expectations. We provided a classic 4-point scale split in the middle that gives a positive and a negative range.
The doughnut diagram shows a clear prevalence of the positive range. There might be a variety of interpretations of this data, starting with the inevitable flexibility of educators who adjust to realities and are ready to promote any effort made in the name of learning.
We advise caution in interpreting this as 'all work here is done', as we have no objective before-and-after measurement, and we are relying on self-reporting. Arguably, this is a case where self-reporting tends to lead to rationalising. We would suggest that schools and individual educators develop and implement a simple system which sets a baseline standard and then measures performance against that benchmark. Plagiarism-detection systems could serve as an example on which to model such a system.
We asked the learners in the survey about their practice in selecting and deciding which sources to use for a particular assignment.
This question is one of the keys to unlocking De Facto's potential. The diagram demonstrates a relatively high percentage of learners (46%) who follow their teachers' suggested lists of sources. This may be out of respect, or due to a lack of initiative or motivation to extend the search. A similar-sized slice of respondents say that they prefer a list of their own. Some 38% respond that they search for sources on each occasion, which we identify as a good practice leading to improved information literacy, less bias and better quality of information (higher relevance). This model should be actively promoted.
It would be interesting to know how this ratio (currently 1:1) will change over time, with improved information literacy on both sides. We can consider as somewhat alarming and potentially dangerous the use of sources such as 'family members' and 'other social circles'. Such sources may often (though not as a rule) be contaminated with mis- and disinformation, and their role in an educational context should be monitored.
It is worth the effort of educators for this to be properly explained and discussed with learners in order to help them build and develop information literacy skills. The results of our survey clearly show the need for such a teaching/training intervention.
Teacher training is predominantly in the realm of formal education. Only 51 respondents (out of 442 in total, coming from all educational levels) answer that they have not undergone formal teacher training. Yet at a systemic level, this is not such an uncommon practice, especially with regard to some upper-secondary and university-level teachers. In some countries, for example, teachers qualified in pedagogical studies are the exception rather than the rule in the HE system.
Our next step, after collecting a larger sample of teachers who have not had formal pedagogical training, will be to look into the possible differences in their practices. Such an investigation can also be carried out, perhaps much more easily, at the individual institution level. A word of caution - we are not implying that teachers without a formal teaching qualification have inferior practices; rather, we expect to see different practices, methods and approaches.
We wanted to investigate whether teachers' practices include formal evaluation criteria for assignments which include (A) the choice and quality of the information sources used by the learner, and (B) the accuracy of the information used in such way.
The teachers seem to assess learners' assignments in a manner consistent with evaluation of both the choice & quality of information sources (66%) and the accuracy of the used/referred information (73%). This means that we are left with between one-third and one-quarter of teachers who disregard such criteria, which is still a large percentage.
While in some cases this can be justified by the specific learning outcomes which a teacher is seeking to achieve by using an assignment, it is certainly an area of improvement. We think that teachers might be avoiding such checks for several reasons: lack of time to properly perform such checks; lack of understanding of their importance; or lack of knowledge and skills how to perform them. All of this can be addressed by the teachers themselves, or by an appropriate support structure in their schools or training institutions.
We would recommend that future studies focus on this matter and investigate further. For all practical purposes concerning De Facto, we see a large area for potential improvement here.
A bit over 70% of the surveyed teachers confirm that they verify the sources of information after the assignment is submitted to them. This sounds very optimistic, as it shows that a significant part of educators do consider sources to be of great importance.
Nevertheless, we still do not know what they mean by 'verify' - is it just a check of compliance with the suggested list, or does it also include verification of sources selected by the learners themselves? How about checking different types of sources, drawing on the online abundance of today? It is worth the effort to explore this aspect further, to see what the possible implications of applying the De Facto tool kit to source-checking in an educational environment could be.
This interactive visualisation allows you to plot and cross-reference several different answer sets in one graph. Try it!
This visualisation is somewhat difficult to interpret, but we insisted on including it in the report because we believe there is an interesting area for further exploration here.
Truly meaningful results here could be obtained by doing a survey which covers educators from all science areas, and is properly stratified.
Unfortunately for us, we worked with a limited selection of educational fields which cannot be balanced and representative due to the peculiarities of the survey. As we were focused on assignments, there is an extra layer of difficulty here, which can be addressed by researchers after us, in that assignments in different subject/science areas are not evenly dependent on the veracity of the information pertaining to them.
Yet it would be very interesting to see how our existing education system has brought up teachers in different subject areas with a different understanding of the nature of information sources. An even deeper exploration would reveal whether these differences are 'native' to teachers because of their focus on a specific scientific field, or whether they are no different from what we would observe in the general population.
We asked educators whether they recommend sources when they give a learning assignment. The graph below summarises the types of sources used by those who gave a positive answer.
Books (57%), scientific publications (52%) and educational videos (43%) are by far the most preferred sources being recommended. It is encouraging that the leading trio consists of a classical-education format, a new-technology/new-media format, and a strong link to science.
This is somewhat confirmed by the second group of sources (in terms of votes by respondents): dictionaries, encyclopaedias and, quite surprisingly in their company, shared platforms of user-generated content. Blogs (9%) and oral sources (8%) may seem alarmingly popular, if we again consider the likelihood that they disseminate opinions rather than facts. But let's consider them safe on the condition that they are relevant to certain contexts, such as broadening the information base and presenting alternative viewpoints which could later be subjected to learning activities related to critical thinking, fact-checking and source-checking.
Although there is a clear group in the lead, it is a good thing that we see a variety of source types being used. This has been shown to be a factor for increased engagement and motivation. We would therefore advise teachers to think about a variety of sources. With regard to mis- and disinformation, this diversity leaves the door open to discussions of the different misleading and manipulation attempts that different source types are subject to.
This is one of our 'mirror questions' from the survey. At several key points, we asked the same question to educators and learners. With this particular instance, we were looking into possible gaps between the two roles which could potentially impact the work with information on learning assignments.
The comparison between educators and learners along this line is quite challenging, and yet intriguing. It is hardly a surprise that it is the educators who still prefer books (be it in print or digital). Nonetheless, the two groups share almost the same interest in video platforms and online magazines. This could be an example of the 'normalisation' of videos as a mainstream educational medium and resource: educational videos have had several decades of use in schools already, with technology only recently shifting to allow more rapid and easy creation and distribution than we used to have with video tapes and video players attached to TVs in classrooms well before the internet era.
Respondents seem unanimous in their least preferred source, podcast libraries (probably because, given their comparative recency, they have not yet gained enough popularity). Learners are more interested in online news sites (43%) than in print media (11%), whereas educators are still split between the two (26% and 24% respectively). These are significant differences.
Our advice to teachers would be to keep an eye on the preferred sources of information of the specific groups of learners they work with, and make sure that the learning assignments (and indeed all learning activities) include a good number of sources that learners find engaging and motivating, and feel confident working with. This may mean that teachers will have to step outside their own comfort zone of established practices and sources, and that they could also use this as an opportunity to observe how different sources of information are used as platforms for attempted disinformation and manipulation. Such observations would be a valuable part of any targeted training intervention in the domain of information literacy (digital literacy & media literacy).
These comparisons really matter as they bring forward an important takeaway: whatever intervention in educational context is planned to increase information literacy, both groups – educators and learners – are to be taken equally into consideration. They are two connected gears of the same engine, the smooth running of which depends on their effective teamwork.
This is one of the key questions in our survey. We needed to establish the existing practices with regard to verifying the sources of information. As we planned the survey, we developed a list of methods based on expert opinion. The respondents could choose as many methods as they wanted.
Note that the graph is broken into two rows for the sake of legibility. There is no other intent or purpose in this representation and order of the data.
Cross-checking appears to be in the lead among all featured methods, except for secondary vocational learners, where it is second to comparing information to previous knowledge. This latter method is also among the three most popular, the third being the use of fact-checking websites.
Referring to co-learners, peers or colleagues follows in the ranking, though it remains preferred by less than 10% of respondents, with the exception of the adult vocational group (20%). That may be linked to age-dependent factors such as the expected accumulation of experience and expertise, but this same logic fails to materialise in other adult education (non-vocational). A possible explanation might be that in adult vocational education there is a clearer expertise profile related to the specific field of study, whereas in adult general learning there is no strong expectation that other adults are, of necessity, more knowledgeable than you.
Resorting to family members for advice fades as learners grow older. It is almost extinct at the university level and makes a small rebound in adult learning, where relevant expertise may once again be held by another family member. If this is reported correctly, we believe that, as a channel for disinformation, this one has a low damage potential.
Using online expert platforms to post a question as a verification method has its followers, but they top out at 8% and hold steady at 5% for all learners except adults. Based on our expertise and the quality expert infrastructure behind some of these websites and services, we would encourage educators to explore the matter and discuss with their learners how to make the best use of the remarkable expert potential there.
The analysts were caught by surprise by the high results for fact-checking websites, since those websites typically cover a very limited number of fields and concentrate on a small number of specific statements within those fields. The idea that they are used by one-fifth to one-quarter of the learners is indeed difficult to understand, as it is highly unlikely that the type of inquiries or questions learners may have for their studies falls within fact-checkers' focus areas. Moreover, the number of fact-checkers in national languages other than English is fairly small, even when we consider large language communities (e.g. Spanish, French, Portuguese, etc.).
Another word of caution from the experts reviewing the survey results. This survey, as others before it, confirms that a large number of people (and learners) work with information by comparing new facts and statements with what they already know. This is a quick and practical method that yields results with little effort. The real danger here is that, as we know from De Facto's work on frames and framing, and the neuroscience/cognitive science behind it, our brain will almost certainly reject and dismiss information which does not conform to our established beliefs or knowledge, or will further reinforce existing beliefs in the opposite scenario. This means that if we already hold wrong knowledge and receive correct and truthful information, this method will dismiss and discard the truthful information to retain the validity of the existing frame. We would advise in the strongest possible terms that teachers work with their learners to explain the nature of these cognitive processes and the danger of overreliance on this method for verification.
Our working hypothesis is that some respondents were not fully aware of what a fact-checker is (and we did not provide definitions or examples in our question), and considered it in a broader, common-sense way, as any site where they can go and verify a fact or a statement by searching and looking through the information. This hypothesis needs to be checked, as it would provide a very valuable insight. Our recommendation to teachers and educators is to investigate this, perhaps with another (formal or more improvised) survey. It would tell us either that fact-checkers are a tool with popularity beyond expectations, or that they are not known and recognisable at all (or to most). This will, in turn, allow us either to turn to fact-checkers and work with them to create a much wider support network for education, or to give teachers the opportunity to present and explain the nature of fact-checking websites and services.
We asked educators whether they specify a minimum number of sources which their learners must use in order to complete an assignment.
We are aware that assignments vary in type and properties among different subjects and levels of education. With this straightforward question, we wanted a snapshot of what may be a prevalent practice. Expertise suggests that in order to build, develop and sustain information literacy skills, it is important to learn how to use and work with a variety of information sources and types. One way of ensuring that this aspect of an assignment is not overlooked is to instruct learners that they must use several different sources and information types. That would be a good first step and would lay the grounds for addressing, in due time, the issue of mis- and disinformation, and of the many ways information can travel from one place to another, or, indeed, from one mind to another.
We already know a bit about how teachers approach the issue of the number of information sources when it comes to assignments. Next we asked them how far they go down this path and whether they specify the sources to be used, i.e. point the learners to an exact location or resource.
As many as 39% of the respondents provide the sources themselves. This may be highly relevant and appropriate at earlier stages of education, when learners have not yet developed sufficiently the skills to search, select, review, and use sources independently.
Another 68% report that they work with recommended, but not mandatory, sources. Of course, this dichotomy could probably be attributed to the inherent personal style of teaching or training, but it can also be a conscious teaching strategy, where learners have the freedom to look around and make a choice themselves, and a 'support net' is being deployed for those who might be experiencing problems with this type of activity.
There is also a small but significant (7%) proportion of teachers who engage in blacklisting sources. This is an exceptionally important area which begs further investigation. It may be the result of competent teachers who themselves exhibit very high levels of information literacy and have clearly established that a particular information source is to be avoided at all costs. However, similar reasoning could be used by teachers who are themselves biased towards or against specific information sources, without good reason or justification, in which case they would be passing that bad judgement on to their learners.
Rather than work with whitelists and blacklists, we would suggest that teachers engage actively in the pursuit of information literacy for themselves and for their learners. Blacklisting is the easiest way of approaching the issue of disinformation. But, as quick as it may be, it is a flawed process precisely because of the bias involved. At the end of the day, information literacy and the decision on whether to use a specific source, fact or statement should remain with the individual, and this should hold in every case. Good-quality sources may occasionally slip and make a mistake, and disreputable sources may also come up with important, true and verifiable information. Our advice to educators would be to avoid the easiest path, as it has the potential to backfire - for them and for their learners - in the long run, as they advance through the education levels and into professional life.
Here we were interested in the learners' perception of the level of difficulty of using the teacher-recommended sources. Though it is a subjective assessment, it is nevertheless important to know what the perceptions of learners on that topic might be, and to see whether they vary across the different education levels.
The perceived difficulty of the sources might influence the attitude towards a particular assignment, but it can also have a cumulative effect in shaping the attitude towards learning assignments in general.
We used a breakdown by education level to avoid the bias towards secondary education. This reveals a clear rising trend in perceived difficulty from general secondary education (11%) to secondary vocational (17%), adult (20%) and university (26%).
One might argue that this is a natural progression reflecting the increased complexity of the studies in these levels. With the exception of university learners (23%), all other groups report similar results for 'easy'. Neutrality is more pronounced for secondary general and university learners with 45-47%, and slightly less so for secondary vocational and adult (38-41%).
However, the 'difficult' share of 11% (the lowest) for general secondary may be a reason for further investigation. The ratio between difficult and easy responses is 1:1.5 for adult, almost 1:1 for university, and 1:2 for secondary vocational. For secondary general the ratio stands at 1:3. In other words, learners in this group face significantly less difficult sources than the rest. Teachers might want to investigate and test whether increasing the difficulty level of some of their sources may bring balance and provide a challenge for more advanced learners. And this advice can be extended to all teachers as good practice - make sure that some of the sources you provide are adequate for less-well-performing learners, while some provide a challenge for those who are up to it. This would translate into higher engagement and motivation levels. Not quite an outlier, the 7% 'very easy' responses for secondary vocational learners may also be considered along the motivation axis.
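The ratio arithmetic above can be reproduced in a few lines. Note the caveat: only the 'difficult' shares (and the 23% 'easy' share for university learners) are reported explicitly in this section; the other 'easy' shares below are assumed values inferred back from the 1:1.5, 1:2 and 1:3 ratios quoted in the text, so this is an illustrative sketch, not the survey data itself.

```python
# Sketch of the difficult-to-easy ratio computation discussed above.
# 'easy' values marked "assumed" are back-calculated from the quoted
# ratios, not taken from the survey dataset.
shares = {
    "secondary general":    {"difficult": 11, "easy": 33},  # easy assumed
    "secondary vocational": {"difficult": 17, "easy": 34},  # easy assumed
    "adult":                {"difficult": 20, "easy": 30},  # easy assumed
    "university":           {"difficult": 26, "easy": 23},  # easy reported
}

for group, s in shares.items():
    # Express each ratio as 1:n, as in the text (e.g. 1:3 for secondary general)
    n = s["easy"] / s["difficult"]
    print(f"{group}: 1:{n:.1f}")
```

With these assumed values the printed ratios come out as 1:3.0, 1:2.0, 1:1.5 and 1:0.9 (i.e. almost 1:1 for university), matching the figures quoted in the paragraph above.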
At the level of secondary education most assignments take up to 2 hours for learners to complete. Going up the educational scale the workload increases, quite expectedly, to reach the maximum for university students.
The amount of time spent on an assignment is, of course, a function of the quality and number of sources as well as the verification process (among other things).
While, naturally, the complexity of an assignment varies greatly across levels, schools and individual teaching practices, it is important to have a benchmark, and to consult frequently with peer educators in order to make sure that the cumulative workload for the learners is not excessively large. Otherwise we risk learners losing the motivation to perform and deliver, and approaching learning assignments in a formal and mechanistic manner.
This graph shows the age distribution of our respondents. As mentioned elsewhere, this reflects the relative 'dominance' of secondary education students in our dataset. Therefore we stress once again that it is important to interpret results in the context of the respondents' level of education, and this is why we provide, wherever possible, cross-sections which make this distinction clear.
Our results from both the learners' and educators' datasets are almost identical. This consensus seems to demonstrate that assignments are a widely used teaching tool for educators and a frequent learning activity for most learners. To substantiate our conclusion, it is important to stress that the surveys were not run in tight-knit communities where teachers fill in a survey and ask their learners to do the same. In fact, most of the time, as far as we know, there was no link whatsoever between the teacher and learner respondents.
It was not our purpose to study the precise frequency, or to compare this to the frequency of use of other teaching tools or learning activities. Our objective was to assess how commonplace learning assignments are in our education and training systems as they stand at the moment.
Based on these results, it seems that learning assignments are an appropriate tool to be used as a principal vehicle for introducing the novel De Facto approach to raising learners' information literacy and building practical skills for identifying mis- and disinformation. It seems, from our work on the survey and elsewhere, that educators can also benefit from such structured interventions. This means that the overall benefit can be felt by both sides in the teaching-learning process, and that the cumulative effect can indeed bring about a strong impact, denying disinformation entry and continued existence - in education itself and beyond, in our social fabric.