OpenIDEO recently launched with a few beta projects aimed at promoting social entrepreneurship – first for helping kids make healthy food choices and then for affordable teaching and learning services (in India). The OpenIDEO web platform is a good use of social media to gather up precedents, promote participation, and organize preferences. People are free to contribute as much or as little as they can, but as with any project, there are clearly different levels of participation. Somewhere I read [from the EVOKE people I think] that there are usually five or so levels of participation in crowdsourcing or social media projects: 1) look around, 2) create an account, 3) some participation, 4) active involvement, and 5) hardcore.
Because I have an interest in teaching and learning, I decided to commit and follow through to the end – contributing as earnestly as possible with my available time. I probably ended up somewhere around “active contributor”, but by no means was I “hardcore”.
I came in a little after the start of the project and didn’t have much time to contribute to the precedents phase. Precedents is where people share examples of things that are relevant to the project brief. Here the brief was to increase the availability and affordability of teaching and learning tools and services in the developing world.
The brief is often where the closest attention should be paid. It’s usually where conflicts and misunderstandings originate. As with any project, the real challenge is first to define the problem – and then to demonstrate how the proposed solutions solve that problem. It sounds easier than it is. I think crowdsourcing succeeds and fails in the ways people perceive and interpret the problem, and how they subsequently map their solutions to the problems as posed. The challenge for any crowdsourcing project to embrace is how to support that interpreting and mapping more effectively.
This post is meant for me to reflect on and assess what I thought was fun and what I thought was less fun about OpenIDEO’s process – as a user and participant. Perhaps because the focus of the challenge was teaching + learning, I viewed it a little like being a student-participant.
What was fun.
The challenge was relevant and broad enough that I was able to easily focus my efforts into developing a few concepts. In most cases, I had the education settings and use-cases in front of me while I was doing my other work on rural agriculture and livelihoods. In all I added three concepts. It was mainly a way for me to think through problems, and I did it as much for myself as I did for the challenge.
In Share the Seed, Not the Tree, I collected data about the costs of materials and services in use at a typical school in a large town in Andhra Pradesh, India. I wanted to use collected data and observations of kids at school because this seemed to be missing from the brief, and because unsubstantiated assumptions about people and contexts are too common. Among the many context submissions, there was a wide range of assumptions about context, affordability, meaning, and culture, and I didn’t really understand where they were coming from. But that’s okay.
On the formal side, I think the developers should have made the formatting a little easier for the user. As it was, I couldn’t present anything in tabular or list format.
Untitled was an information tool for library services we’ve been working on at CSTEP, which provides a simple-to-implement way of tracking library books and other assets. Common resources like libraries and parks are REALLY difficult to maintain in India – unless you have a guard and locks.
One take-away lesson from the concept I sent in (and for OpenIDEO) was that I think teaching and learning will benefit more when the resources that are present are made visible, with the rules and users clearly shown to all. We need information technologies that simultaneously support different modes of interaction – from centralized to decentralized and everything in-between.
News Ecologies Remix Design (Figs 1 & 2) was as much an experiment with graphic design as it was a way of thinking through the novel industrial ecology of newspaper recycling and aggregation AND journalistic content creation.
What I really like in hindsight was the eventual use of the concepts – something that wasn’t made quite clear up front. The ‘winners’ were all compiled into a resource guide that provided a series of steps and questions to help move subsequent innovators through the design process themselves. The winning concepts were not projected as projects to be implemented – they were positioned more as catalysts for teaching and imagining.
So in the end, the brief ended up more like a rapidly prototyped workbook – filled out with design ideas. The OpenIDEO platform was a quick way to generate relevant content that could be used to support people’s thinking, as well as a process for local actors working on a similar design brief.
What was less fun.
I have way more to say about what was fun and less fun, but because of time, I only want to focus on a few things that seemed consistent or inconsistent with the aims of the challenge.
On the less fun side, the social aspects of the platform were not as enriching as I expected. There were ‘winners’ in a collaborative process, and this raises multiple issues as part of a larger discussion about framing, education, and collaboration.
I also didn’t get a stable sense of interaction with other participants. Keep in mind the platform is still in beta, and they are (I assume) working on additional “features”. Inter-participant interactions consisted of comments on posts and “applaud” recognition. I really wished I could have been notified by email of updates to comments and other interactions between participants.
I also got the sense it was a popularity contest. This was reinforced in the evaluation phase where, after an intense round of concepting, forty concepts were shortlisted. If I were a student in a classroom, this would have been really discouraging. It was like working to satisfy a set of criteria and then finding out afterwards that you were actually being evaluated against a different set of rules.
“We’ve now got 40 concepts based on popularity and those which have the most potential, as chosen by GMC. In order to get down to 30, and help these ideas move forwards, please evaluate them against the criteria.”
I think this is where OpenIDEO really failed with this challenge. Most students at a certain age are not disappointed by not winning. It’s not knowing how to improve that kills your motivation. This is exactly the challenge for India. Many teachers – especially at the college level – are themselves unable or unwilling to distinguish relevant knowledge and its applications from less effective ones. What they do know, they stick with – leaving innovative educational models in the dust (quite literally sometimes).
Experienced teachers also know that if students are uninformed about why they got a certain grade, they get upset and frustrated and will lose motivation quickly. This is probably why standardized curricula and testing are used so much in schools – and why ‘progressive educationists’ react so strongly to any mention of evaluation or standards. When no one has to be responsible for facilitating that map between problems and solutions, there are only simple, correct and incorrect answers.
It would have been better to do the detailed evaluation first – giving feedback on all the concepts – and the “applause” round second, with the detailed evaluations available as evidence of the mapping between solution and problem. Yes, it would perhaps have been more tedious, but so what.
If I had known it was all about popularity, I probably wouldn’t have invested the effort. There was no way to ‘see’ the mapping between the problem statement and ‘winning’, making it appear arbitrary because the mapping wasn’t made visible. What I wanted was the opportunity to see whether my perspectives matched the challenge problem and where it needed improvement. So in the end, I didn’t learn much.
But hey, it’s a beta test and failing is good.Â Hopefully it becomes an opportunity for better implementation.
The second round of evaluation was more detailed and asked respondents to rate the solutions on a few different criteria – along with detailed comments to help improve their effectiveness. I don’t want to get too much into the feasibility of many of the ideas for India, but I will say that there could have been better alignment between the concepting phase and what schools and education are like in India. I don’t want to be a downer on brainstorming, but I did feel like some of the social interactions were too encouraging, without providing any real interpretation of the costs, benefits, or obstacles that the solutions presented. But then maybe that is ENTIRELY appropriate given the India-based context. Perhaps providing a more detailed design brief along with supporting materials would be one way to give such a diverse array of participants more meaningful context.
In summary, it was fun, challenging, enriching, and I’d do it again. However, because the social and evaluative aspects value certain actions over others, I am less inclined to contribute as fully as I might otherwise. Nonetheless, in its successes and failures, it’s a powerful example with lessons for the design of teaching and learning tools, values, and services.