Crowdsourcing Teaching and Learning Services: OpenIDEO in beta as a case study

OpenIDEO recently launched with a few beta projects aimed at promoting social entrepreneurship – first helping kids make healthy food choices, and then affordable teaching and learning services (in India).  The OpenIDEO web platform is a good use of social media to gather precedents, promote participation, and organize preferences.  People are free to contribute as much or as little as they can, but as with any project, there are clearly different levels of participation.  Somewhere I read [from the EVOKE people, I think] that there are usually five or so levels of participation in crowdsourcing or social media projects: 1) look around, 2) create an account, 3) some participation, 4) active involvement, and 5) hardcore.

Because I have an interest in teaching and learning, I decided to commit and follow through to the end – contributing as earnestly as possible with my available time.  I probably ended up somewhere around “active involvement”, but by no means was I “hardcore”.

I came in a little after the start of the project and didn’t have much time to contribute to the precedents phase, where people share examples of things relevant to the project brief.  Here the brief was to increase the availability and affordability of teaching and learning tools and services in the developing world.

The brief is often where the closest attention should be paid; it’s usually where conflicts and misunderstandings originate.  As with any project, the real challenge is to first define the problem – and then to demonstrate how the proposed solutions solve that problem.  It sounds easier than it is.  I think crowdsourcing succeeds and fails in the ways people perceive and interpret the problem, and how they subsequently map their solutions to the problem as posed.  The challenge for any crowdsourcing project is how to support that interpreting and mapping more effectively.

This post is meant for me to reflect on and assess what I found fun and what I found less fun about OpenIDEO’s process – as a user and participant.  Perhaps because the focus of the challenge was teaching + learning, I viewed it a little like being a student-participant.

What was fun.
The challenge was relevant and broad enough that I was able to easily focus my efforts on developing a few concepts. In most cases, I had the education settings and use-cases in front of me while I was doing my other work on rural agriculture and livelihoods. In all I added three concepts.  It was mainly a way for me to think through problems, and I did it as much for myself as I did for the challenge.

In Share the Seed, Not the Tree, I collected data about the costs of materials and services in use at a typical school in a large town in Andhra Pradesh, India. I wanted to use collected data and observations of kids at school because this seemed to be missing from the brief, and because unsubstantiated assumptions about people and contexts are too common.  Among the many context submissions, there was a wide range of assumptions about context, affordability, meaning, and culture, and I didn’t really understand where they were coming from. But that’s okay.

On the formal side, I think the developers should have made formatting a little easier for the user.  As it was, I couldn’t present anything in tabular or list format.

Untitled was an information tool for library services we’ve been working on at CSTEP, which provides a simple-to-implement way of tracking library books and other assets.  Common resources like libraries and parks are REALLY difficult to maintain in India – unless you have a guard and locks.

One take-away lesson from the concept I sent in (and for OpenIDEO) was that teaching and learning will benefit more when existing resources are made visible, with their rules and users clearly shown to all.  We need information technologies that simultaneously support different modes of interaction – from centralized to decentralized and everything in-between.
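Since the concept turns on visibility, here is a minimal sketch of the idea – not the actual CSTEP tool, and with hypothetical names and fields – showing an asset register whose items, rules, and current holders are open for anyone to read:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Asset:
    asset_id: str
    title: str
    rule: str                      # the borrowing rule, stated openly
    held_by: Optional[str] = None  # who holds it now, visible to all

class OpenRegister:
    """A register anyone can read: items, rules, and current users are all public."""

    def __init__(self) -> None:
        self.assets = {}

    def add(self, asset: Asset) -> None:
        self.assets[asset.asset_id] = asset

    def check_out(self, asset_id: str, user: str) -> None:
        asset = self.assets[asset_id]
        if asset.held_by is not None:
            raise ValueError(f"{asset.title} is already with {asset.held_by}")
        asset.held_by = user

    def check_in(self, asset_id: str) -> None:
        self.assets[asset_id].held_by = None

    def show(self) -> None:
        # The point of the concept: state and rules printed for everyone to see.
        for a in self.assets.values():
            print(f"{a.asset_id}: {a.title} [{a.rule}] -> {a.held_by or 'available'}")

register = OpenRegister()
register.add(Asset("B001", "Science, Grade 6", "return within 7 days"))
register.check_out("B001", "Priya")
register.show()
```

Anything fancier – guards, locks, centralized control – can layer on top, but the baseline is that the state of the shared resource is legible to everyone who uses it.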

News Ecologies Remix Design (Figs 1 & 2) was as much an experiment with graphic design as it was a way of thinking through the ‘hovel’ industrial ecology of newspaper recycling and aggregation AND journalistic content creation.

What I really liked in hindsight was the eventual use of the concepts – something that wasn’t made quite clear up front.  The ‘winners’ were all compiled into a resource guide that provided a series of steps and questions to help move subsequent innovators through the design process themselves.  The winning concepts were not projected as projects to be implemented – they were positioned more as catalysts for teaching and imagining.

So in the end, the brief ended up more like a rapidly prototyped workbook – filled out with design ideas.  The OpenIDEO platform was a quick way to generate relevant content that could be used to support people’s thinking as well as a process for local actors working on a similar design brief.

What was less fun.
I have way more to say about what was fun and less fun, but because of time, I only want to focus on a few things that seemed consistent or inconsistent with the aims of the challenge.

On the less fun side, the social aspects of the platform were not as enriching as I expected.  There were ‘winners’ in a collaborative process, and this raises multiple issues as part of a larger discussion about framing, education and collaboration.

I also didn’t get a stable sense of interaction with other participants.  Keep in mind the platform is still in beta, and they are (I assume) working on additional “features”.  Inter-participant interactions consisted of comments on posts and “applaud” recognition.  I really wish I could have been notified by email of updates to comments and other interactions between participants.

I also got the sense it was a popularity contest.  This was reinforced in the evaluation phase where, after an intense round of concepting, forty concepts were shortlisted.  If I were a student in a classroom, this would have been really discouraging.  It was like working to satisfy a set of criteria and then finding out afterwards that you were actually being evaluated against a different set of rules.

“We’ve now got 40 concepts based on popularity and those which have the most potential, as chosen by GMC. In order to get down to 30, and help these ideas move forwards, please evaluate them against the criteria.”

I think this is where OpenIDEO really failed with this challenge.  Most students at a certain age are not disappointed by not winning; it’s not knowing how to improve that kills your motivation.  This is exactly the challenge for India.  Many teachers – especially at the college level – are themselves unable or unwilling to distinguish relevant knowledge and its applications from less effective ones.  What they do know, they stick with – leaving innovative educational models in the dust (quite literally sometimes).

Experienced teachers also know that if students are uninformed about why they got a certain grade, they get upset and frustrated and will lose motivation quickly.  This is probably why standardized curricula and testing are used so much in schools – and why ‘progressive educationists’ react so strongly to any mention of evaluation or standards.  When no one has to be responsible for facilitating that mapping between problems and solutions, there are simply correct and incorrect answers.

It would have been better to do the detailed evaluation first – giving feedback on all the concepts – and the “applause” round second – with the detailed evaluations available as evidence of the mapping between solution and problem. Yes, it would have been more tedious perhaps, but so what.

If I had known it was all about popularity, I probably wouldn’t have invested the effort. There was no way to ‘see’ the mapping between the problem statement and ‘winning’, which made the outcome appear arbitrary because it wasn’t made visible.  What I wanted was the opportunity to see whether my perspectives matched the challenge problem and where they needed improvement.  So in the end, I didn’t learn much.

But hey, it’s a beta test and failing is good.  Hopefully it becomes an opportunity for better implementation.

The second round of evaluation was more detailed and asked respondents to rate each solution on a few different criteria – along with detailed comments to improve its effectiveness.  I don’t want to get too much into the feasibility of many of the ideas for India, but I will say that there could have been better alignment between the concepting phase and what schools and education are actually like in India.  I don’t want to be a downer on brainstorming, but I did feel like some of the social interactions were too encouraging, without providing any real interpretation of the costs, benefits, or obstacles that the solutions presented.  But then maybe that is ENTIRELY appropriate given the India-based context.  Perhaps providing a more detailed design brief along with supporting materials would be one way to give such a diverse array of participants more meaningful context.

In summary, it was fun, challenging, enriching, and I’d do it again.  However, because the social and evaluative aspects value certain actions over others, I am less inclined to contribute as fully as I might otherwise.  Nonetheless, in its successes and failures, it’s a powerful example with lessons for the design of teaching and learning tools, values, and services.
