Thursday, 21 November 2013

(Re)opening a discussion about our methodological landscape (by Mary Anne Kennan, Charles Sturt University)


It is uncommon for published research to articulate the meta-theoretical assumptions which underpin different paradigms/approaches/traditions of research. Instead, methodological discussion tends to focus on research methods, techniques and tools. As researchers, however, our methodological assumptions determine
  • our choice of research paradigm; 
  • the formulation of, and the way in which we consider and address our research questions; 
  • and our selection of methods and techniques.

In this discussion we will consider the pros and cons of adopting a broader view of research methodology.  This includes being specific about meta-theoretical assumptions and their importance for achieving greater reflexive awareness of the “unconscious metaphysics” that underlie and influence how we see and research the world.  To do this we need to make a clear distinction between the concept of methodology as an overall logic of inquiry, and research method as a much narrower concept that defines processes, procedures and techniques that can be used to conduct empirical studies and collect and analyse data (Cecez-Kecmanovic, 2011).  To achieve this clear distinction, it is necessary to make explicit the assumptions and paradigms on which different methodologies are founded.

Methodological issues are therefore discussed within a broader landscape, one which includes not just research methods and techniques but also what lies behind them.  By discussing the issues in this context, we can ask questions such as:

How does our view of the world and meta-theoretical foundation influence the type of questions we will ask (and answers we will find) as researchers?

How does our world view open (and/or limit) the potential methodological paths that we will choose from?

How does being methodologically explicit help us to select and justify the research methods we choose to answer our questions?

References

Cecez-Kecmanovic, D. & Kennan, M.A. (2013). The methodological landscape: Information systems and knowledge management. In K. Williamson & G. Johanson (Eds.), Research Methods: Information, Systems, and Contexts (pp. 113-138). Prahran, Victoria: Tilde Publishing. ISBN 978-0-7346-1148-2.

Tuesday, 8 October 2013

Giving an oral presentation (by J.E.A. Wallace)

All researchers need to learn to communicate details of their projects to a wider audience.  Obviously, different audiences require different approaches, but to give some perspective, here are a few hints and tips gleaned from the experience I’ve had with my own project thus far.

Firstly, and perhaps most importantly: know your audience.   This is not so much about the tone (although over-familiarity is a really bad idea in a formal setting!), but rather about the content of your talk. When presenting to your own group, there is perhaps a level of common knowledge that means that you can assume the basics and only give abbreviated reports on progress.   However, in any other context, it is best to assume that the audience has little knowledge of the specifics of your work. To that end, a brief introduction is useful to set the scene before getting into the details.

Secondly, pacing and timing are key.  A talk divided into clear sections holds the interest of an audience better than a long talk without breaks. It also makes it easier to rehearse timings and to be flexible.  Both are useful when preparing to talk in a formal setting such as a conference.  In such a setting, the Chair will offer some sort of signal near the end of your slot, or a clock will be made available.  In either case, rehearsing timings beforehand is useful.  Do bear in mind, though, that people tend to talk more quickly when nervous.
On the subject of nerves, remember that the audience is there to hear what you have to say. Everyone in the room is on your side so if there are any minor slip-ups, focus on your work and just keep going.  No-one will object if you need to take a moment to gather your thoughts mid-talk.

Lastly, don’t dread question sessions – they are usually a great opportunity to engage with the audience and possibly to establish some beneficial networking connections. From personal experience, these sessions have led to many worthwhile discussions after the fact, and to one fruitful collaboration.

If anyone has any other experiences or additional suggestions for presenting, please feel free to contribute in the comments.

Monday, 7 October 2013

Back to basics

Following last month's discussion, James Wallace made the useful observation that, since the bulk of people attending the researchers' group are PhD students, it would be helpful to have occasional talks on some of the basic skills needed for any researcher.  Two subjects he suggested were presenting and posters.

Not only was James forthcoming with his suggestions, he was also prepared to back them up by volunteering his services.  He kindly offered to lead this month's discussion on presenting.  He provided a useful blog entry in advance of the session which I took the liberty of editing a little.  James' original entry asserted that it is important for young researchers to develop the ability to communicate details of their projects.  I edited it to read "All researchers need to learn to communicate details of their projects to a wider audience."





Thursday, 12 September 2013

Filtering the Web (by Simon Wakeling)

The emergence of the internet has allowed its users previously unimagined access to information. A corollary of this information deluge has been information overload – the difficulty those users face in finding, filtering and processing relevant information from amongst the sea of available content. The web services that have emerged to connect us with this information, from search engines to social networks, have in recent years increasingly turned to personalisation as a means of helping users successfully navigate the web.

In many cases this personalisation takes the form of algorithmic content filtering, with implicit user feedback channels ranging from geography, to click history, to the activity of other users within social networks used to determine the information presented to the user (indeed, it has been suggested that Google uses up to 57 different data sources to filter its content). An unintended consequence of this process, which it should be noted is rarely made explicit to the user, is to gradually and invisibly disconnect information seekers from contrary content. A much-used example concerns the political liberal, whose regular consumption of left-wing sources and active online network of similarly minded friends eventually sees any conservative material excluded from his search results and news feeds. The result is a “filter bubble” – a term coined by Eli Pariser to describe the self-perpetuating results of the information intermediary’s algorithm. Pariser sees web users inhabiting “parallel but separate universes” in which the absence of the alien undermines objectivity.
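
To make that mechanism a little more concrete, here is a deliberately simplified sketch in Python of how one implicit signal (click history) might be folded into a ranking. The function names and scoring scheme are illustrative assumptions rather than a description of any real engine's algorithm, but they show how repeated clicks on one kind of content can quietly push contrary content down the list.

    # A hypothetical, minimal personalisation step (not any real engine's algorithm):
    # documents whose topic matches the user's click history get a score boost,
    # so contrary content gradually drifts down the ranking.
    from collections import Counter

    def personalised_rank(results, click_history, boost=0.5):
        """Re-rank (title, topic, base_score) results using past clicks."""
        topic_counts = Counter(topic for _, topic in click_history)
        total_clicks = sum(topic_counts.values()) or 1

        def score(result):
            title, topic, base_score = result
            # Boost is proportional to how often this topic was clicked before.
            return base_score + boost * topic_counts[topic] / total_clicks

        return sorted(results, key=score, reverse=True)

    results = [
        ("Budget analysis: a left-leaning view", "liberal", 0.62),
        ("Budget analysis: a right-leaning view", "conservative", 0.60),
    ]
    history = [("Earlier story A", "liberal"), ("Earlier story B", "liberal")]

    for title, _, _ in personalised_rank(results, history):
        print(title)  # the left-leaning piece now outranks the right-leaning one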

It is interesting to note that philosophers have already begun to engage with this issue. From an epistemological perspective, Thomas Simpson has noted that in an age when search engines are increasingly seen as “expert”, an inbuilt and invisible confirmation bias increases the potential for beliefs to be formed on the basis of “reasons unconnected with the truth”. Others have begun considering the ethical implications. Is the filter bubble a form of censorship? Do we as users have a right to expect objectivity from the systems we use?
 
These are interesting questions for information scientists to consider. First though, as scientists we might want to investigate whether the phenomenon actually exists and, if it does, how it can best be measured and its implications studied. Is it necessarily a bad thing? We might also begin imagining solutions. An example of work already undertaken in this area is Balancer, which presents a stick man leaning under the weight of the “liberal” or “conservative” news sources the user has accessed.  It’s certainly novel, despite its limitations and acknowledged imperfections. Perhaps though the significance of the filter bubble problem requires more revolutionary solutions. We just need to work out what they are…
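
Returning to Balancer for a moment: stripped of its visualisation, a measure of that kind can be very simple. The sketch below is purely illustrative (it is not Balancer's actual implementation, and the source labels are assumed); it just averages the leanings of the sources a user has read to decide which way the stick man should lean.

    # A toy "reading balance" score, assumed for illustration only:
    # -1 means entirely "liberal" reading, +1 entirely "conservative".

    def reading_balance(visited_sources, leanings):
        """leanings maps a source name to -1 (liberal), 0 (neutral) or +1 (conservative)."""
        scores = [leanings.get(source, 0) for source in visited_sources]
        return sum(scores) / len(scores) if scores else 0.0

    leanings = {"Daily Left": -1, "Centre Times": 0, "Right Herald": +1}  # hypothetical labels
    history = ["Daily Left", "Daily Left", "Centre Times"]

    print(reading_balance(history, leanings))  # about -0.67: the stick man leans left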

Monday, 9 September 2013

Decision making for microbes

A long, long time ago, in a life far, far away, I spent several years working as a researcher in the life sciences.  When I shifted into information sciences, I found myself looking at aspects of the subject from a biologist's perspective.

One of the aspects that I found myself considering was that of decision making.  Living things benefit if they can make best use of the resources that surround them.  To do so, they need information.  Or so I suggested in a paper that I wrote ten years ago (Madden, 2004) in which I argued that when organisms evolved the ability to move (and possibly before), they evolved a need for information.  

As is always the case at the discussion meetings, the talk went in some interesting directions.  Unsurprisingly I was pulled up for my lax use of the term "World View".  I had quoted Checkland's assertion that "Judging from their behaviour, all beavers, all cuckoos, all bees have the same W[orld View], whereas man has available a range of W[orld View]s." (Checkland, 1984, p218).  It was, I conceded in the discussion, a questionable assertion that I had not questioned.


Checkland, P.B. (1984). Systems Thinking, Systems Practice (2nd edn). Chichester: Wiley.
Madden, A. D. (2004). Evolution and information. Journal of Documentation, 60(1), 9-23.

Sunday, 21 July 2013

Peer review, failure and the benefits of persistence

Earlier today I found myself reflecting on our recent discussion on peer review after I came across the reviews that accompanied the rejection of a paper I submitted a while back (see below).  The second review, though somewhat scathing, at least offered a few pointers to what the reviewer thought was wrong. The first reviewer obviously did not suffer from self-doubt.  Her/his comments were wholly unhelpful.

The paper was later published (albeit in a less prestigious journal) and was recently nominated for an outstanding paper award.

It could well be argued that I had submitted to the wrong journal.  Clearly the audience of The Electronic Library was more receptive than the reviewers whose comments are published below.  However, it would be helpful if reviewers could be reminded of the value of supporting their statements with evidence, and discouraged from presenting opinions as facts.  Referee 2 asserted that "The presentation and argumentation is poor by any standards".  This was a rather arrogant extrapolation.  The reviewer's standards were presented as indicative of all standards.  Such practice is common amongst journalists, but should, I feel, be avoided by academics, even under the cloak of anonymity.  

Fortunately for me and my co-authors, Referee 2's belief that her/his standards were universal proved not to be the case.

Referee 1
This is very weak in many ways: the related work is unacceptably shallow, the experiment itself may be fatally flawed (I cannot tell from the current description but it feels that way) and the results are trivially presented. Even if they fixed the above I think the design and their current approach to investigating this issue may be problematic for publication; we already know a lot about relevance criteria and evaluative judgments and I doubt their approach is going to give much that is new. This almost assumes that nothing has been done in this area and their experiment feels very naïve. The only (relatively) new aspect is the meta-cognitive approach but they need to do something sensible with this rather than treating it as a buzzword. 

Referee 2
I recommend that this paper not be accepted by the journal. The paper has many serious problems that need to be addressed and lead the reviewer to reject the paper.

The presentation and argumentation is poor by any standards, and well below what would be expected of a journal like [name of journal]. The paper needs a good abstract that provides a good summary of the paper. The current abstract is poorly written, and gives little insights into the key findings of the study. Most readers will only read the abstract and not read the complete paper unless the abstract is good.

The introduction section is weak: the paper is not clear as to the research problem, and why it is important and significant. The paper has no clear theoretical framework, nor an adequate literature review. Research questions not clearly spelled out.

The research design section of the paper is poor for an empirical study. This section is inadequate and does not reach the level that another researcher could replicate the study. Needs a data collection section and a data analysis section.

The Results section is unclear and not comprehensible. 

The paper has no Discussion section that discusses the key findings in relation to previous studies. Where are the limitations and implications?

Much of the Conclusions section needs to be in an earlier section. No further research is discussed.

The references section is messy, not in alphabetical order and has missing references.

Overall, the quality of this paper is poor in structure and content. The paper lacks a theoretical framework, is a poor empirical paper and is embarrassingly weak for a paper submitted to [name of journal]  I recommend that the paper be rejected.

Tuesday, 9 July 2013

Introducing KNIME (by Edmund Duesbury)

This blog entry could revolutionise the way you do information-based research!

KNIME (pronounced /naɪm/, rhyming with "time") represents an exciting step forward in programming and the development of scientific tools for information sciences.  KNIME is a freely available, open source workflow tool which allows users, in a few clicks, to develop "programs" which can
  • generate statistics from given input data,  
  • perform visual analyses and
  • generate graphs and web-based reports.

KNIME has built into it a wide variety of program libraries. Elements of these can be dragged and dropped into a workspace, enabling users to create a program in mere minutes.  These are not only useful, they look good as well!  Because KNIME makes use of visual representations of program components, it's incredibly easy to use.  There is no need to mess around with UML or pseudocode, and planning of workflows becomes straightforward.

In this session, James Wallace and I will try to introduce you to KNIME, and give an example of how to obtain simple statistics from some input data.  We’ll also demonstrate a classification method for plants which anyone can implement and which is easy to understand. Please feel free to come along, ask questions, and have a play with KNIME.
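
KNIME itself is drag-and-drop rather than code, but for anyone who prefers to think in scripts, the sketch below (in Python, using scikit-learn) shows roughly the kind of pipeline the demo workflow will represent: read a table, summarise it, then train and score a simple classifier. The Iris plant dataset and the nearest-neighbour classifier are assumptions made purely for illustration; the session itself may use different data and different nodes.

    # A rough scripted equivalent of the demo workflow (illustrative only).
    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split
    from sklearn.neighbors import KNeighborsClassifier

    # Read the data into a table (a file/table reader node in KNIME).
    iris = load_iris(as_frame=True)
    table = iris.frame

    # Simple summary statistics for each column (a statistics node).
    print(table.describe())

    # Partition the data, then train and score a classifier
    # (partitioning, learner and predictor nodes).
    X_train, X_test, y_train, y_test = train_test_split(
        iris.data, iris.target, test_size=0.3, random_state=0)
    model = KNeighborsClassifier(n_neighbors=3).fit(X_train, y_train)
    print("Classification accuracy:", round(model.score(X_test, y_test), 2))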

More information, including download links, can be found at: http://www.knime.org/knime






Friday, 24 May 2013

Tara Brabazon podcasts

I just came across a series of podcasts by the Australian academic Tara Brabazon, which include a number about doing a PhD. The latest is "Tara's ten tips for a PhD oral examination": http://tarabrabazon.libsyn.com/webpage/category/podcasts

Tuesday, 21 May 2013

Peer review, citation inflation and 'power citing'

This month's discussion began as a conversation about the value and validity of peer review, but drifted into a  reflection on citation inflation.

There was a clear divide in the group.  Those yet to publish in peer-reviewed journals were more trusting of the peer review mechanism than were those of us who had, at some time or other, had to cope with reviewers who clearly failed to understand what they had read and yet still felt qualified to demand changes.  Or worse still, reviewers whose remarks were not in any way helpful and appeared gratuitously vitriolic.

Unfortunately, such behaviour is an unpleasant side effect of the anonymity of peer review.  Anonymous peer review is often thought to be a long-established practice, but when, several years ago, I tried to find out just how far back it goes (Madden, 2000), I was surprised to discover that nobody appears to know; it was probably introduced after World War II.

Angharad mentioned that, in her experience of being on the editorial team for Library and Information Research, it was not uncommon for reviewers to suggest to the author of the paper under review that it would be improved if it referred to the reviewer's work.  That led to a shift in topic and we began discussing the reasons why papers are cited.

I wasn’t alone in being frustrated by the growth in the number of articles cited in papers nowadays.  As an example of this growth, when I looked through early issues of the Journal of Documentation, I discovered that, throughout the forties and fifties, hardly anyone cited anything.  As is clear from the graph below, however (plotted using data from Singh, Sharma, & Kaur, 2011), things are very different nowadays.

No doubt some of the work cited in a paper is genuinely useful to the author of that paper. However, I've been guilty in the past of including papers just to show that I was aware of them, rather than because they added much to my thinking or understanding.  I know from talking to other researchers that this is not uncommon.


One practice I would like to see introduced is that of power citation.  As well as nominating keywords for their article, authors could nominate up to five references which they found to be particularly valuable when compiling it.  These would act as "edited highlights" of the reference list and provide guidance to anyone wanting to know where to begin if they wished to read around the article.  If the practice became widespread, it might also prove a useful bibliometric tool.

Maybe Library and Information Research can be persuaded to pioneer the practice.

Madden, A. D. (2000). Comment: When did peer review become anonymous? Aslib Proceedings, 52(8), 273-276.
Singh, N. K., Sharma, J., & Kaur, N. (2011). Citation analysis of Journal of Documentation. Webology, 8(1).

Thursday, 9 May 2013

Post viva questionnaire - responses from Joanne Bates


What is the title of your thesis?
The Politics of Open Government Data: a neo-Gramscian analysis of the United Kingdom’s Open Government Data initiative.


Can you provide an abstract (for inclusion in this blog)?
See abstract


How long did you spend preparing for your viva?
Intermittently - when I could find the time - for about a month (mostly re-reading the thesis and finding likely questions on the internet), and then a heavy burst for a week or so before.


How long did your viva take?
About an hour and a half to two hours.

Is there anything you wish you had done differently?
I’d got a bit addicted to using the word ‘emergence’ in my thesis – this was picked up by my external and led to quite a few questions about causality, which I probably could have avoided… see the next question!

Did the examiners concentrate on any particular section of your thesis? If so, which?
My external was a political economist, so there were a lot of questions on political and economic theorists that I hadn’t drawn on even though they were relevant to the topic. Luckily I was aware of all the ones he mentioned and had a defence for not using them in my work – I think he just wanted to check I had a broader appreciation for the field than you can articulate in a single piece of work.

Also, there were a few questions on how I perceive causality in relation to social structure and agency. There were a few inconsistencies in some of the words I’d selected (i.e. emergence) and my discussion of agency. It was a good critical point, and it was good to be able to make clear my argument for the examiners.


Most of the discussion was actually quite theoretical or conceptual, rather than focussing on my methods or specific findings etc – I didn’t have to open my thesis or notes once, which I was surprised about.


Can you describe any part of your viva where you were pleased with your performance?
One of my key concepts – neoliberalism – hadn’t been given a detailed enough definition in the analytical framework of the thesis. Thankfully, I’d prepared a good response to the “define neoliberalism” question, and I got asked precisely that close to the start of the viva. My response was solid, without any wavering, and the examiners seemed pleased with it – so that gave me confidence moving forwards.


What was it you did that pleased you?
Being able to answer the question succinctly without tying myself in knots – it gave me confidence.


Can you describe any part of your viva where you were dissatisfied with your performance?
Not really – I was pretty happy with it in general.


Please give an example of a question that you found hard.
I said something about equality or social justice or something along those lines, and one of the examiners said – “Do you mean emancipation? Who do you want to emancipate?”


Why was it hard?
Well that’s a political and theoretical minefield of a question … I tried to give some sort of succinct response, but I don’t think it was the high point!


What was the outcome of your viva?
Passed with a couple of minor amendments - and some advice on further amendments to consider if I want to try and get a book contract.


Please give some examples of the sort of corrections you need to make (if any).

A more in-depth definition of neoliberalism – a key analytical concept I’d used (don’t overlook the obvious when you’re reading your own work!)

Draw out the analytical framework more throughout the body of the empirical chapters.


Do you have any tips for looking and feeling confident in front of the examiners?

I put myself on a hard core stress aversion regime the week before my viva. I swam or climbed every day to kill the adrenaline. I stopped working at 5 the evening before, tired myself out with exercise, slept well, and then went for a walk in the morning before heading into university. I also wore a new outfit, and got my hair cut!

I think mentally you need to have confidence in your work and your ideas, that you know your key concepts and arguments (and whatever else you are likely to get asked about, depending on your field of research) and be able to talk about them. I practised by answering (and asking) questions out loud, rather than just reading my notes.

If you go in with a good thesis, that’s going to massively increase your confidence – so the hard work is actually before the viva.
Be yourself!

Can you think of any good advice that you would give to students who are preparing for their viva?
Remember that your examiners are people too; they want to see you get through even when they are grilling you on something.

I used loads of practice viva questions I found on the web to help me prepare – didn’t get asked most of them in the end, but it was useful nonetheless.

Enjoy the discussion, and remember you know more about this particular research project than they do - your examiners have probably only spent about a day preparing for the viva.

That bit you are worried they will ask loads of hard questions about – you’re probably being paranoid and it probably won’t even come up, so make sure you don’t over-concentrate on it when preparing. Ask your supervisors if you’re unsure whether you should be worried about it.

Make sure that relaxation time is totally embedded in your preparation plans.  

Have fun plans for the evening that you can look forward to!

Sunday, 7 April 2013

Post viva questionnaire - responses from Rita Wan-chik

Rita Wan-chik successfully defended her PhD thesis on 28th February 2013.  She agreed to complete our questionnaire, and her responses are below.  Many thanks Rita!

What is the title of your thesis?
Religious information seeking on the web: A study of Islamic and Qur’anic information searching.

Can you provide an abstract (for inclusion in this blog)?
See Abstract.

How long did you spend preparing for your viva? 
To be honest, I only started to prepare for the viva about a week before the viva date, although I wish I could’ve started earlier. I’d planned to read the whole thesis at least 2 or 3 times but only really managed to do it ONE time. Other than reading my own thesis, I also read all the papers that had greatly influenced my research, and papers by both examiners (their selected studies which are related to my research). I also tried to answer some of the popular viva questions which I had compiled from the web and from friends.


How long did your viva take? 
Almost 2 hours. It was an afternoon session.


Is there anything you wish you had done differently? 
I wish I could’ve planned my research methods and data collection better, and that I’d started writing my thesis way earlier so that I didn’t have to dump everything at the end. I am one of those people who wait and only write when there are a lot of things to write about.


Did the examiners concentrate on any particular section of your thesis? If so, which? 
Yes, the concentration was mainly on my Literature Review, Methodology and Discussion chapters. The examiners thought that my LR needed more depth and breadth in its discussion of the literature, and that the scope of my LR was not clearly indicated. We also discussed the justification for choosing a mixed-methods approach and the data collection techniques employed in my study.

Can you describe any part of your viva where you were pleased with your performance? What was it you did that pleased you? 
I was quite pleased with the way that I managed to answer almost every question asked by the examiners, although only they would know which responses were really answering their questions (☺), but I did try my best to explain. I was also pleased when one of the examiners pointed out that I had done a great job with one of the data analysis techniques carried out in my study (although I still need to add a lot more clarification on some of the steps taken to improve the repeatability of my research).


Can you describe any part of your viva where you were dissatisfied with your performance? 
I am pretty much satisfied with my performance and the whole viva process.


Please give an example of a question that you found hard. 
N/A


What was the outcome of your viva? 
The examiners were happy to recommend the award of PhD to me subject to minor amendments.


Please give some examples of the sort of corrections you need to make (if any).

  • Further detail on the procedure for the interviews, e.g., on how follow-up questions were asked (p.37), to probe responses, would be helpful.
  • “Throughout … lengths.” on page 39 requires clarification.
  • Page 41. Further detail on how the codes, subcategories and themes were developed, e.g., bottom-up or top-down, would be helpful. 
  • A diagram of the final themes, categories and codes should be provided in the Results chapter.
Do you have any tips for looking and feeling confident in front of the examiners?

  • Get good sleep and rest on the day and night before the viva.
  • Don’t force yourself to try to read everything or as much as you can a few days before the viva.
  • Start preparing early so that you can do it gradually. You will know when you just need to stop preparing.
  • Don’t stress yourself out.

Can you think of any good advice that you would give to students who are preparing for their viva?

Talking to your friends about your study can help you to answer confidently in the viva. It is like doing a rehearsal, only it’s unofficial and more relaxed. You can try to improve the answers which did not get good responses. Trying to answer the popular viva questions (you can find these on the web) can also help you to prepare for the viva. It did help me.

I’d also like to add, for those who are still in the process of writing their thesis, make sure you get somebody within your research group, other than your own supervisor(s), to read your final thesis. Let them comment on things that need more clarification and then make those amendments accordingly. People within your group may be able to see the ‘holes’ that you and your supervisor(s) may have missed or underestimated. 

All the best to you!