THE BLOG

03
Feb

Planning an online survey – tips and tricks (1)

I realised I haven’t blogged for a while, so I thought I’d start a series of posts about designing online surveys. These tips and tricks are based on my training materials, and I hope you find them useful.

So, first things first. It’s important to consider five things when planning an online survey:

1. Population: Who is your target population? How many people are there in this target population?

2. Objectives: What is the main reason for running this survey? What’s the single most important focus? Write this down, and don’t be tempted to bloat the question set with loads of other questions.

3. Ethics and data protection: How can you treat people fairly, and how will you store and handle data now and in the future?

4. Timeline: How long have you got to plan it, run it, and analyse results?

5. Sampling method: Who to contact, and how? How many responses do you need to get valid data that represent the population?
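
On that last point, a standard sample-size formula for estimating a proportion, with a finite population correction, is a reasonable starting point. Here’s a minimal Python sketch; the 95% confidence level, 5% margin of error and the example population of 2,000 are illustrative assumptions, not rules from this post.

```python
import math

def sample_size(population, margin=0.05, z=1.96, p=0.5):
    """Responses needed to estimate a proportion for a finite population.

    Assumes simple random sampling, a 95% confidence level (z = 1.96),
    the worst-case proportion p = 0.5 and a 5% margin of error.
    """
    n0 = (z ** 2) * p * (1 - p) / margin ** 2            # infinite-population estimate
    return math.ceil(n0 / (1 + (n0 - 1) / population))   # finite population correction

print(sample_size(2000))  # ~323 completed responses for a target population of 2,000
```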

Everyone who plans a survey should get into the habit of documenting the answers to these questions.

To help with this process I’ve created a single-page A4 planning document. Please feel free to download the survey plan document and use it for your online survey planning.

Watch this space for the second update of tips and tricks in the next few weeks!

24
Mar

The G Shed – our new home

Our new ‘G Shed’ Bristol city centre office is up and running, with some lovely sofas from The Bristol Guild and some even more wonderful wall painting courtesy of Emily Ketteringham.

As of 24th March 2014 we still have two desk spaces available if you know of anyone who wants city centre office space. They would be sharing with us, Pure Usability, 3Sixty – a web design / experience company – and Steven Dodds – a third sector marketing guru with a love of teapots. Please get in touch if you’re interested!


12
Nov

Online surveys: three examples of what not to do!

Fowler (1995) said “Improving question design is one of the easiest, most cost-effective steps that can be taken to improve the quality of survey data”.

It’s certainly true that no other single action will do as much to improve the quality of your survey data and your response rates, and yet every day I see terrible examples of survey questions. Here are just three examples of what NOT to do!

1. Don’t manipulate your data to get a positive result

The classic here involves use of the Likert scale – the scale that aims to rate opinion. So for example you might ask people how much they agree or disagree with a statement, on a five-point scale. Fine, this is a common survey question type. But hang on, surely there needs to be an equal number of positive and negative answers, right? Not if you’re GfK, who created the survey below for Ikea customers! Tut tut, they might have given the client what they wanted to hear, but have they actually created any useful data with this? Something that their client can use to drive improvement in their business? I think not 🙂

[Screenshot: GfK’s unbalanced rating scale from an Ikea customer survey]
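
For contrast, a balanced five-point agreement scale has as many negative options as positive ones, with a neutral midpoint. Here’s a tiny Python sketch of that idea – the wording and the numeric coding are just one common convention, not something prescribed above:

```python
# A balanced 5-point agreement scale: two negative options, a neutral
# midpoint and two positive options, coded symmetrically around zero.
BALANCED_SCALE = {
    "Strongly disagree": -2,
    "Disagree": -1,
    "Neither agree nor disagree": 0,
    "Agree": 1,
    "Strongly agree": 2,
}

def is_balanced(scale):
    """True if the scale offers equal numbers of negative and positive options."""
    values = scale.values()
    return sum(v < 0 for v in values) == sum(v > 0 for v in values)

print(is_balanced(BALANCED_SCALE))  # True
# A skewed scale (mostly positive options, like the example above) fails the check.
print(is_balanced({"Poor": -1, "OK": 1, "Good": 2, "Very good": 3, "Excellent": 4}))  # False
```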

2. Don’t use ‘sometimes’ or double negatives in your questions!

This seems basic, right? And yet I see this all the time. Imagine you’re asked to agree or disagree with the following statement: “I’m sometimes overworked”. If you agree, fine. But what if you choose ‘disagree’? Does this mean you’re NEVER overworked, or ALWAYS overworked?
Using ‘sometimes’ and double negatives can get you into an awful lot of trouble. Just look at the example below, from a national poll commissioned by the American Jewish Committee in 1992. Needless to say the survey led to some alarming and untrustworthy results!

[Screenshot: the 1992 double-negative poll question]

3. Don’t ask unrealistic questions

Check this one out, from Virgin Media. I’m a Virgin Broadband customer, so they sent me a survey. But how on earth am I supposed to know how Virgin’s broadband speed compares with other suppliers?! They know I’m a customer of theirs. Ipso facto I have no experience of other providers. This question is a waste of my time and results in invalid and unhelpful data.

[Screenshot: Virgin Media question asking how its broadband speed compares with other suppliers]

For more examples of what not to do – and how to get survey design and analysis right – why not come along to the next training event in Bristol on 11th December? More details available here.

21
Oct

Office for National Statistics infographics

I was having a wander around the ONS site today and liked the ‘on the fly’ interactive maps and charts that you can play with, built from the 2011 national census data.

On the plus side, they’re really nice to see and can be easily embedded into other sites via the code they provide (see example below – though the window is small).

However, I think they are lacking some basic but vital information and are therefore open to being misinterpreted. First, the colouring of the labels. In the maps of passport holders in the UK (see below), dark purple is used for districts where 90+ people hold a UK passport, yet the same shade is used for districts where 3+ people hold a Polish passport. If you didn’t look at the key you could easily think we were being invaded by Poles (I can think of some relatives who would jump quickly to that assumption 🙂).

Second, with the drop-down list of nationalities, why does it start with Poland? Why isn’t it alphabetical? It feels politicised.

Finally – there is no explanation of what these figures actually mean. I assume this is ‘out of 100 adults’, so dark purple for UK passports would mean “90 or more out of one hundred adults”, and the Polish dark purple “three or more out of one hundred adults”. Then again it might include children, but I don’t know because it didn’t tell me!

So close, ONS. I might drop them a request to add these vital extras in!

Graphic by Office for National Statistics (ONS)

12
Sep

Post #2: An evaluation of funding to support organisational effectiveness in Further Education

Earlier this year I was employed by the Learning and Skills Improvement Service (LSIS) to evaluate the impact of two funding strands on the Further Education (FE) sector. This second post focuses on the evaluation of the ‘Organisational Effectiveness’ (OE) funding scheme.

This involved studying 50 projects, each running for one year, that were funded by the OE scheme. The aim was to identify factors that correlated with project success or failure, and provide recommendations to assist the FE sector and future funders. With the closure of LSIS, and with a new FE funder only just on the horizon (the ETF: Education & Training Foundation), I thought it would be useful to share the findings of this research.

The OE grants, of up to £25,000 (typically £20K) provided by LSIS, were intended specifically for use within leadership, management and governance. Most projects focussed on both improving efficiency and increasing effectiveness.

Broadly speaking this study found that the FE sector – particularly GFE colleges – is coming round to the realisation that in a competitive environment providers need to be more efficient, provide realistic and useful system measures, and prove increases in efficiency/effectiveness to both themselves and others. There is no doubt that the LSIS Organisational Effectiveness funding strategy provided a very useful platform for funded FE organisations to begin to realise this need.

A link to the full report (including six examples of successful projects) is provided at the bottom of this post. The most interesting findings of the OE evaluation were as follows:

1. Return on investment was high (I was surprised how high)

Estimated savings per organisation in Year 1 varied between £1,750 and £412,000, with a median of £43,565. If we remember that the average investment was £28,500 per project (£20,000 from LSIS and £8,500 from the organisation), this gives us a Year 1 ROI of 153%. If these per annum savings continue into Year 2, this increases to an average ROI of 305%. Only 6% of projects failed, so nearly everyone got something genuinely financially positive out of the experience.
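
For transparency, those percentages are consistent with ROI being calculated as cumulative savings divided by total investment – that formula is my reading of the figures, so treat it as an assumption. A quick check of the arithmetic in Python:

```python
median_saving_year1 = 43_565   # median estimated saving per organisation in Year 1 (£)
investment = 28_500            # £20,000 from LSIS plus £8,500 from the organisation

roi_year1 = median_saving_year1 / investment        # ≈ 1.53, i.e. the 153% quoted above
roi_year2 = 2 * median_saving_year1 / investment    # ≈ 3.06 if Year 1 savings repeat
# The post quotes 305% for Year 2; the small gap is presumably rounding in the source data.
print(f"Year 1 ROI: {roi_year1:.0%}, Year 2 ROI: {roi_year2:.0%}")
```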

I was amazed (and I am not usually amazed!). In fact, in some cases I was blown away. Many suggest that FE institutions should be funding change management themselves rather than awaiting support from external funders. Fair enough perhaps. But I had to admit that the gains to the end user, the institutions and the sector were phenomenal. And many said they wouldn’t have ‘upped their game’ if it weren’t for the presence of an external funder with high expectations and a clear project structure and deadline.

2. Savings are set to continue at least into years 2 and 3

Most projects anticipate that savings will continue or grow throughout Years 2 and 3. The evaluation confirmed that initial predictions were reliable at least into Year 2. In some cases predicted savings were huge; for example one project (#31: “Reducing curriculum delivery costs by teaching common QCF units across programmes and departments”) is on track for an ROI of over 600% by Year 3 – that equates to over £500,000 saved in three years.

3. The project gave organisations confidence to promote further change.
The survey showed that in 71% of cases the project had catalysed further positive organisational change. Examples are listed in the report.

Sadly the results and findings were not published through LSIS, because LSIS closed just as this project came to an end. However I have blogged the report here and will let the ETF know about the work (they may already be aware, but anyway). Please pass this on to others who might be interested!

Link to the Organisational Effectiveness impact study (June 2013)

12
Aug

Post #1: What makes a successful technology project in FE? Advice for funders and the funded

Earlier this year I was employed by the Learning and Skills Improvement Service (LSIS) to evaluate the impact of two funding strands on the Further Education (FE) sector. This involved studying the 100+ projects funded by one or other scheme to identify factors that correlated with project success or failure, and to provide recommendations to assist the FE sector and future funders. With the closure of LSIS, and with a new FE funder only just on the horizon (the Education & Training Foundation), I thought it would be useful to share the findings of this research.

This first blog post focuses on the Leadership in Technology funding strand. The Leadership in Technology (LiT) grant scheme, run each year by LSIS (2010 – 2013), offered FE providers the chance to bid for and win a small amount of funding for the efficient and effective use of technology within their specific Further Education context.

The grants, of up to £6,000, were intended for use in one of two possible contexts: for teaching, learning and assessment purposes; or within leadership, management and governance. Half of the funds had to be used to employ a mentor to coach and assist the funded organisation in the implementation of the technology and new ways of working. About 60 projects were funded across the three years.

A link to the full report is provided at the bottom of this post, together with seven narratives that give case studies of successful projects. The most interesting findings of the LiT evaluation were as follows:

1. Small funding initiatives have the potential for significant impact within FE organisations. The LiT grants were a very good use of limited funds: small pots of money allowed the inclusion of a large number of organisations within the FE sector, and the short timescale (one year) drove rapid change and maintained focus. The scheme provided just enough money, time and expertise (via the mentor) to give an organisation the impetus to step away from front-line provision and spend time focusing on ways to improve their service. Without this type of funding it is likely that many opportunities for up-skilling and enthusing employees and learners, improving efficiency and effectiveness, and discovering ways to save significant sums of money would have been lost.
My opinion: If I were considering how best to fund future projects I would definitely recommend this ‘small amount of money across lots of organisations’ approach. Many might argue that a project needs a budget of tens of thousands to have a chance of creating any impact. However, this small grant still attracted three bids for every successful grant holder, showing there are organisations prepared to bid for small pots of cash (particularly those in the Adult Community Learning or Work-based Learning sub-sectors). Most importantly, the outcomes were often as significant as those I’ve observed in projects with far larger budgets.
One thought – the project management involved in organising and evaluating 60 projects shouldn’t be underestimated, but I created efficient ways to do this. For example, plotting project names and vital details geographically using BatchGeo – and colour-coding things like sub-sector – was really useful both for keeping track of projects and as a communication tool for discussing projects with others such as LSIS.
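
If it helps, here’s a minimal sketch of the kind of spreadsheet that approach needs – the project records below are invented for illustration and the column names are mine; BatchGeo (as I understand it) geocodes a pasted or uploaded spreadsheet and can colour-code markers by a chosen “group by” column such as sub-sector.

```python
import csv

# Invented example records: real project names, postcodes and grants would go here.
projects = [
    {"Project": "Project A", "Postcode": "BS1 4RN",  "Sub-sector": "GFE", "Grant (£)": 6000},
    {"Project": "Project B", "Postcode": "LS1 1UR",  "Sub-sector": "ACL", "Grant (£)": 5500},
    {"Project": "Project C", "Postcode": "SO14 7DU", "Sub-sector": "WBL", "Grant (£)": 4000},
]

# Writing a simple CSV keeps the data re-usable: paste it into BatchGeo for the map,
# and keep the same file as the master tracking list for the evaluation.
with open("lit_projects.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["Project", "Postcode", "Sub-sector", "Grant (£)"])
    writer.writeheader()
    writer.writerows(projects)
```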

2. We must never underestimate the difficulty in integrating new ways of working into embedded cultural and organisational practice. This was the largest blocker to success, and many project holders reported that they wished they had spent more time getting staff and management buy-in for the project.
My opinion: All too often the project action plan focuses only on what will be done and how. It totally fails to consider people – who will be involved, who will be affected, and so on. My main recommendation was to include people in the action plan – this is central to change management / systems thinking approaches and often has profound effects (see my up-coming blog post about the Organisational Effectiveness projects). When I was asked to give the keynote speech at the LSIS ‘Technology for Success’ conference I created a slide that gave a few practical questions I thought should be included in future action plans – check out the full presentation on SlideShare, or the relevant slides below:

[Slides: practical questions to include in future action plans]

3. Dissemination events should be organised by the funder, and be compulsory for all project holders. Evidence showed that dissemination outside of an organisation rarely occurred naturally other than via ‘known networks’ to which the organisation already belonged. This may be because known networks have a history of give-and-take and meet regularly to share and exchange knowledge, and thus are considered less threatening than simply advertising your successes to potential commercial rivals. Events such as the LSIS Technology for Success conference offered an invaluable way to encourage people to share, network, discuss and disseminate their practice.

4. Project mentors were vital: they were experts in a specific problem, and were listened to because they were external to the organisational culture. All the LiT projects had to name and fund a mentor, who would coach the organisation through the process of identifying and integrating a technology solution. These individuals were identified by the funded organisation as having the skills and knowledge they needed. Mentors provided a method to rapidly up-skill organisational representatives and assist with cultural change. They tended to be listened to far more than those employed internally.

5. Forget ‘cutting edge’ technologies – funding the basics can still have massive organisational benefits. This was particularly true of organisations that were spread across many geographical locations (e.g. adult community learning sites across a county). They tended to apply for LiT funding in order to implement basic ICT solutions (e.g. to replace face-to-face training with an online presentation to teach tutors how to use a whiteboard) rather than experiment with new technologies.
My opinion: Whilst these projects might not feel as exciting to some funders, when successful they made some of the largest economic savings of all LiT projects. Many FE organisations have not yet had the time to step away from front-line services and consider how to integrate basic technologies. Given the chance, this can save thousands of pounds. It’s not sexy, just common-sense use of technology, but if you’re interested in ROI then these kinds of project – especially for work-based or adult community learning sub-sectors, and/or in organisations with several geographical locations – should be top of the list.

Here are the links to the full report, plus the seven case study project examples:

Full report
This includes seven examples of successful projects in the appendix, these being:
Narrative 1: A successful offline e-portfolio
Narrative 2: WBL successful VLE and culture change
Narrative 3: Sixth form embracing mobile use to increase engagement
Narrative 4: iTunesU and QR codes in FE College reach out to students outside the classroom
Narrative 5: Using video to evidence progress in learners with disabilities
Narrative 6: Shift to ICT CPD materials in face of restructuring and downsizing
Narrative 7: How, when and why projects go wrong

07
Feb

Presentation at the LSIS ‘Technology for Success’ conference

I spent Tuesday at the LSIS Technology Conference, which proved to be a valuable day out. I met lots of interesting people, heard about some great projects, and was fortunate to be asked to give one of the keynote talks.

My talk covered the preliminary results of a project I’m carrying out for LSIS at the moment. It’s an interesting topic, focussing on the impact and sustainability of small funding initiatives – basically how to get the most out of small funding grants (£10K or lower). My talk covered issues such as:

  • What does success look like?
  • What factors are associated with successful projects?
  • What factors correlate with the project catalysing more positive changes within an organisation?
  • Why do things go wrong?
  • One practical way to deal with an important blocker: include ‘people’ in your action plan.

    Several people have asked for copies of the presentation, so I’ve uploaded a version to SlideShare (see below).

    Note that this work is ongoing – please check back here over the next few months for the full report and summary information.

19
Dec

    My first workshop on survey design gets a thumbs up!

    Thanks to everyone who came along to the first workshop on effective survey design, held at the MShed yesterday (18th December). It was lovely to get such a positive response.

I promised delegates that I would make the materials available to them, together with my POETS survey planning sheet. These can be downloaded below, but please note that they are covered by a Creative Commons licence that allows you to share them non-commercially so long as you acknowledge the original source:

    PowerPoint slides of the workshop
    POETS planning tool

    For those interested in attending the next workshop, I can now confirm that it will be held at the MShed on 30th April – details from the Timmus events Eventbrite webpage.

    Below there’s a photo kindly taken by Mike @ BOS – a big thanks to both Mike and Tom Quinn for coming down and supporting the event.

    Finally, for those interested in the possibility of using mobile phones for data collection (Nadia I didn’t forget!) please check out FormAgent’s website. They are based in Southampton and my other half Stuart Church met them a few weeks ago – he says they are really nice guys and it’s a simple but clever product.

    Hope to see you in April 2013!

    Cheers,

    Tabetha

    Tabetha presenting the first ‘effective survey design’ workshop

    20
    Sep

    Online survey design and analysis – two Bristol-based training workshops coming up!

    Over the past two years the contracts I’ve worked on have increasingly focussed on survey design and analysis, to the extent that this now makes up around 75% of my work. There used to be a lot more primary research about (literature reviews, data collection in the field, interviews etc), but the current economic climate seems to have forced people into using the cheapest methods of data collection, and this often results in the creation of online surveys.

    Clients frequently ask me to analyse survey data after it’s been collected, which can make things tricky if the survey design wasn’t so hot in the first place! This often results in clients subsequently requesting I re-design the original survey tool in order to make it robust enough to survive several years “un-tweaked”.

    Getting the design of a survey right is vital, especially as many clients need to gather benchmarking metrics – if questions are tweaked each year it becomes impossible to compare between years.

This got me thinking about teaching survey design and analysis. So many sectors and trades now use online surveys, and people in wildly different job roles can be responsible for them – but how do people learn to design and analyse surveys effectively?

    It’s easy to run a bad online survey. The web is littered with them – surveys that are way too long, questions that are ambiguous or irrelevant, pop-up requests that are annoying. Such surveys only serve to annoy customers or staff, and collect data that is unlikely to be valid or representative.

    Similarly, there are often problems with survey analysis. Reports often only contain mean averages and simple graphs, making it all too easy to misinterpret information, report findings incorrectly, or miss subtle data trends. Even when a difference in averages between groups does appear, how do you know that difference is real and not due to chance alone?
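
That last question is what significance testing is for. Here’s a minimal Python sketch using SciPy – the scores below are invented purely for illustration, and a t-test is only one option (for ordinal, Likert-style data a non-parametric test such as Mann–Whitney is often more appropriate):

```python
from scipy import stats

# Invented satisfaction scores (1-5) from two groups of respondents.
group_a = [4, 5, 3, 4, 4, 5, 3, 4, 4, 5]
group_b = [3, 3, 4, 2, 3, 4, 3, 3, 2, 4]

# Welch's t-test: is the gap in means bigger than chance alone would plausibly produce?
t_stat, p_value = stats.ttest_ind(group_a, group_b, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

# Rule of thumb: p < 0.05 suggests the difference is unlikely to be chance alone.
# It is not proof, and it says nothing about how big or important the difference is.
```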

    So I decided it was time to organise a couple of half-day workshops, one about online survey design and a second that focuses on survey analysis. They’ll both be held at the beautiful MShed building in Bristol in December and January. I’ve also started offering them as an in-house training package.

I’ll let you know how it goes! And if you’re interested in attending, click here for more info.

    21
    Feb

Teachers’ use of digital resources: drivers and blockers

    The trigger for this post – a paper on teachers’ use of digital technologies in Singapore
    Earlier this week I stumbled on an interesting paper about teachers in Singapore in a recent issue of BJET. The authors were interested in whether teachers were following the recommendations about integrating digital technologies into their teaching practice.

They asked student teachers to complete a survey about how much they tended to comply with requirements to use digital technology, their perceived competence with digital technologies, the value they placed on them, and how frequently they used them as a teaching aid.

    Interestingly they found a negative correlation between compliance and technology use, indicating that “teachers who were more competent at using digital technologies were less likely to be compliant in the rule for using them in class, and those who were compliant may not actually apply digital technologies in their teaching”.

    In contrast, a teacher’s perception of their competence and the value of digital technologies were positively correlated with how often they used technology in class.

    This interesting finding suggests that mandating the use of digital technologies – telling a teacher what digital technology to use and when to use it – may not be the best way to get technology into teaching practice (surprise surprise!).

    Indeed the authors state that “a more productive approach may be to enhance the competence of teachers in the use of digital technologies so that they value its effectiveness and are confident to apply it in classroom activities”.
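
As an aside, results like these usually come from correlating survey scale scores. Here’s a tiny illustration in Python with invented numbers (Spearman rather than Pearson because Likert-style scores are ordinal; the paper’s actual method may well differ):

```python
from scipy import stats

# Invented per-teacher scores (1-5): self-rated compliance and frequency of technology use.
compliance = [5, 4, 5, 3, 4, 5, 2, 4, 3, 5]
use        = [2, 3, 2, 4, 3, 2, 5, 3, 4, 1]

rho, p = stats.spearmanr(compliance, use)
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")  # a negative rho mirrors the paper's finding
```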

    Drivers and blockers to teachers’ use of digital technologies
    This got me thinking about a review I did for Becta called “Teacher use of digital learning resources: a summary of the drivers and blockers to resource use”. It was completed in 2007 so it is getting on a bit now, but it’s worth summarising some interesting points.

After a pretty extensive review, I decided that the drivers and blockers appeared to exist on one of four ‘levels’, these being:
    1. Technical (e.g. having access to the internet from the classroom)
    2. Organisational (e.g. Schools promoting the use of digital resources)
3. Teacher/process (e.g. teacher time constraints, confusion and concerns relating to copyright)
    4. Teacher/emotional (e.g. not feeling valued for creating resources, fear of looking stupid in front of peers or students if something goes wrong)

    Who owns a teacher’s digital resources?
The thing I found interesting was that, based on the many anonymous responses I received from teachers after posting for comments on teacher discussion forums, the most powerful blocker related to emotional issues. This was either a fear of looking foolish, or a resentment that resources would not be valued – and might even be stolen – by other staff or by their school/LEA. And yet I couldn’t find any literature about this issue anywhere, which is why I canvassed the discussion forums. Here are a couple of quotes from the replies I received:

    “I’m usually happy to share resources with people and contribute to forums such as SLN. However my HOD barely ever speaks to me or thanks me for anything I do. Imagine my surprise when I find that my HOD has copied files (without my permission) from my disk and then passed my work off as theirs for an area of the curriculum they are responsible for planning… It was my disk, the work was produced on my computer and in my own time so I think it’s mine?” TES Community Forum (2007)

“I teach in FE and have developed computer-based courses such as English for Driving…It was passed on to XXXXX by someone within my institute. It is now selling commercially and the college is making money from royalties. What do I get? Not even a drop of recognition. How do I feel? Mixed emotions.” ICTResearch Email reply (2007)

    “I teach in an FE college in Scotland. My contract of employment states that the college owns the copyright of everything produced (even in my own time). Many Scottish FE colleges have the same or similar contracts. The only way to share anything is to break the contract – which many people do. As far as I know no one has ever had any problems – but it is hardly encouraging.” ICTResearch Email reply (2007)

    “I’ve just resigned from working with one particular LEA… for telling me that I can’t discuss ideas and problems on lists such as this in case I say something that ‘will reflect badly on the LEA’. This clearly makes it more difficult for teachers to share anything.” ICTResearch Email reply (2007)

    Makes for interesting reading, huh.

    Conclusions
So what to conclude? In Singapore we see that teachers do not want to comply with mandated directions for the use of digital technologies. They want to take ownership, and to choose when and how to use digital resources so that they dovetail with their teaching.
In my review I found that when teachers did bother to make resources, they were often told that the resources belonged to the school, to the LEA, and in one case to the HoD :o)

    Surely if we want teachers to engage with using and creating digital resources – things that can in theory be swapped and shared and re-mixed – then we should ensure that they keep ownership, and that we train teachers about copyright issues and ways in which they can protect their resources whilst still sharing them.