Monday, 20 February 2012
Narrative as an element of communication in cultural heritage organisations (by Paula Goodale)
- Find out more about general users' information behaviour, using survey and observational methods.
- Establish the current state of play on the use of narrative, using desk research and interviews with museum personnel.
- Explore the most promising opportunities for incorporating narrative into user engagement, using open-ended and creative tasks in different systems (including PATHS and one or more social media sites), both individually and collaboratively.
- Evaluate the outcomes of the task-based activities, giving consideration to qualitative measures such as satisfaction with the results, perceived outcomes, quality of the user experience, etc.
- Analyse the content of the narratives produced via task-based activities to understand more about the nature of narratives produced by general end users compared with those produced by experts.
Friday, 10 February 2012
Surveying online survey tools (by Angharad Roberts)
Following on from Andrew's blogpost about online surveys, yesterday's session provided an excellent opportunity to discuss people's experiences of different online survey tools. I presented a document (which can be viewed below or here) describing a range of online tools, evaluated according to four criteria which seemed important to me: compliance, compatibility, clarity and cost.
Compliance relates in part to the important issue of data protection, raised at the end of last month's meeting. EU data protection law says data shouldn't be transferred to countries outside the European Economic Area, unless the destination country has equivalent laws. US law is not regarded as providing the same level of protection (9 countries which do are listed here), yet the US is home to some of the biggest data-hosting companies, including Google and SurveyMonkey. There is something called the US-EU Safe Harbor framework, which enables US-based companies to show that their data protection procedures meet EU standards. Google complies with this, as do many big online survey companies. Another compliance issue is the accessibility of the survey - for example, does it work with the screen reader technology that may be used by people with visual impairments?
Compatibility relates to the options a survey tool provides for exporting data - can responses be downloaded in a format SPSS reads directly, or only as an Excel spreadsheet? (A short conversion sketch follows below.)
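As an illustration (not tied to any particular tool), here is a minimal Python sketch of the kind of conversion that becomes necessary when a tool only exports CSV. It assumes the pandas and pyreadstat packages are installed (plus openpyxl for the Excel step), and all file names are hypothetical:

```python
# A minimal sketch, assuming the survey tool can at least export a CSV file.
# File names are hypothetical, purely for illustration.

import pandas as pd
import pyreadstat

# Load the tool's raw CSV export.
responses = pd.read_csv("survey_export.csv")

# Excel for quick inspection and sharing (requires openpyxl).
responses.to_excel("survey_export.xlsx", index=False)

# SPSS .sav for statistical analysis; pyreadstat preserves column names.
pyreadstat.write_sav(responses, "survey_export.sav")
```

Running a round-trip like this on a small pilot export is a cheap way to spot problems before committing to a tool.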
Clarity - for me, this mostly relates to question types and particularly so-called skip logic questions. I have a number of different potential target audiences, and although I want to ask most of them the same questions, there are some questions I only want to ask one particular group. Skip logic allows for different pathways through the same survey, depending on answers to particular questions (see the sketch below).
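To make the idea concrete, here is a minimal sketch of skip logic as conditional routing between questions; the question IDs, the wording and the "librarian" group are all hypothetical:

```python
# A minimal sketch of skip logic: the next question is chosen based on
# answers given so far. Question IDs and groups are illustrative only.

QUESTIONS = {
    "group": "Which group best describes you? (student/librarian/academic)",
    "catalogue_use": "How often do you use the library catalogue?",
    "standards": "Which cataloguing standards do you work with?",  # librarians only
    "comments": "Any other comments?",
}

def next_question(current, answers):
    """Return the ID of the next question, given all answers so far."""
    if current == "group":
        return "catalogue_use"
    if current == "catalogue_use":
        # Skip logic: only librarians are routed to the standards question.
        if answers.get("group") == "librarian":
            return "standards"
        return "comments"
    if current == "standards":
        return "comments"
    return None  # end of survey

def run_survey():
    answers, current = {}, "group"
    while current is not None:
        answers[current] = input(QUESTIONS[current] + " ")
        current = next_question(current, answers)
    return answers

if __name__ == "__main__":
    print(run_survey())
```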
Cost - there are lots of free versions of survey tools, but these often have very limited functionality. I've set these out in the limitations column of my document. For example, SurveyMonkey allows just 10 questions and 100 responses in each free survey. Export options may be limited in some free tools as well. So I may need a paid-for survey tool, which means it's helpful to know what the range of potential costs could be, including potential discounts for academic / research use (SurveyGizmo offers a free student account, but badges these surveys with a SurveyGizmo student research logo - which may not be the image I want to project). It also raises the question: are there survey tools in use within the department which it might be useful to know about?
This was followed by a very valuable group discussion about some of these issues. In response to a question from Alex Schauer, I clarified that most of these survey tools claim to support surveys in "all languages" or "40+ languages". I'm not sure whether these terms are used interchangeably, but no survey tool appears to list more than 59 supported languages, and these figures seem to relate to the Unicode standards for supported language character sets. Bristol Online Surveys only supports 10 languages in addition to English (at extra cost); the free version of QuestionPro has no multi-language support (a point omitted from the version of the document I presented yesterday, but included in the copy linked to from this post). Barbara described problematic experiences with attempting to export data from survey tools, and suggested experimenting with the export process before choosing a tool to run the actual survey. Liz Chapman described the different approach to data protection displayed in one recent US survey ("we can't promise anything about your data...") and some of the limitations of SurveyMonkey, which may be addressed in one of the more expensive subscription versions. Mark Hall talked about his experiences of using LimeSurvey, and the exciting prospect of an in-house survey tool, developed in the iSchool, which would potentially give iSchool researchers complete control of their survey data. Paula also described some of the more powerful features offered by QuestionPro.
You can view, download and print the summary document here, or view it on Scribd.
Thursday, 9 February 2012
Surveying Universities: A Modest Proposal
The background
I am drawing near to the end of a project looking at the information behaviour of students at various stages of education, beginning with Key Stage 3 (11- to 14-year-olds) and going all the way up to postgraduates.
A key part of the project involved surveying universities in the Midlands and the North of England. Between us, my colleague (Mary Crowder) and I approached 12 universities, asking them to circulate our survey amongst students and staff. Responses to our request varied. All too often, however, we got one of two answers:
1) We were told that there was nobody in particular responsible for posting surveys and that we could try Computing Services, Students' Union, Marketing, Student Administration, or various local equivalents; or
2) They already had numerous questionnaires generated by their own staff and students and were concerned that people would get survey fatigue.
In the end, we got responses from students and staff at five universities. As an inducement, we offered to enter respondents into a prize draw, with the opportunity to win a £50 Amazon voucher. At each university, two vouchers were offered to students and one to lecturers, so the project paid out £750 in prizes (five universities × three vouchers × £50). Since the response rate at one university was very low, some students and staff there had an extremely high chance of winning.
The modest proposal
I presume that we are not alone in wishing to learn about the views of university staff and students across the UK. We are also not the only project to offer the inducement of a prize draw. I suggest, therefore, that UK research councils with an interest in educational research should consider setting up a central site on which RCUK-funded researchers can post surveys. Completion of a survey would qualify a respondent to enter a draw, with prizes provided from Research Council funds. To enter the site, it would be necessary to log on with an .ac.uk email address (a minimal eligibility check is sketched below).
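As a rough illustration of that last point, here is a minimal sketch of the proposed eligibility check; the regex, function name and example addresses are all hypothetical:

```python
# A minimal sketch: only accept sign-ups from .ac.uk addresses.
# The pattern and names are illustrative, not a production validator.

import re

AC_UK_PATTERN = re.compile(r"^[^@\s]+@(?:[A-Za-z0-9-]+\.)+ac\.uk$")

def is_eligible(email: str) -> bool:
    """Return True if the address belongs to a UK academic (.ac.uk) domain."""
    return bool(AC_UK_PATTERN.match(email.strip().lower()))

assert is_eligible("jane.doe@sheffield.ac.uk")
assert not is_eligible("jane.doe@example.com")
assert not is_eligible("spoof@ac.uk.evil.com")  # .ac.uk must be the suffix
```

In practice a real scheme would also need to confirm ownership of the address (e.g. by emailing a sign-in link), since the domain check alone proves nothing.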
So far (according to Blogger's statistics) this blog has been viewed 3,000 times by readers in 16 countries. If anyone has knowledge of such a scheme within their country, or can suggest ways to elicit the opinions of students and staff across Higher Education, I would very much appreciate their comments on this subject.