November 2009
By Shelly Orr Priebe

Shelly Orr Priebe’s 22-year tenure at McElroy Translation bridged multiple eras in the localization industry’s evolution. As McElroy’s president for the last decade, she directed a quantum leap in the technology infrastructure that supports crowdsourcing, while remaining equally passionate about the valuable role of expert humans in redefined workflows. Shelly is currently consulting and exploring new opportunities in progressive environments.

sop(at)austin.rr.com

Tom Sawyer – A crowdsourcing pioneer?

Most American schoolchildren are familiar with The Adventures of Tom Sawyer (Mark Twain, 1876) and, thanks to translators, many students around the world have also read Twain’s classic. In the book’s most famous scene, the protagonist Tom is assigned the task of whitewashing a fence. While his motivation is to avoid work rather than cost, he cleverly manipulates his friends and acquaintances into doing the work for him. Not only does he leverage effective “non-financial rewards,” but he even gets others to compensate him for the “privilege” of contributing to the effort. Could this be the first recorded instance of crowdsourcing?

If so, it is missing a key component of the overall process: quality control. Twain is silent on that subject, but the translation and localization industry need not be. In fact, the industry needs to challenge perceived “truths” about crowdsourcing. The controversial topic of localization crowdsourcing needs a good whitewash.

Fast-forward. How is crowdsourcing playing out in the language services space? As with the original story, there are plenty of people with sufficient language expertise to volunteer for translation work, especially when the “non-financial rewards” are attractive. There are certainly situations where this approach can be appropriate and effective. The oft-cited Facebook experience is a prime example. In the Facebook translation scenario, the volunteers/users themselves may actually benefit in intangible ways by expanding the reach of the social networking site to others in their language of interest. Facebook localized its site into 65+ languages while building loyalty, accelerating viral marketing, and respecting the power of the market itself to determine correct terminology and phraseology.

Why, then, is crowdsourcing a controversial topic in industry circles? Is the furor among professional translators warranted? It certainly has been a topic of discussion. Take LinkedIn’s attempt to replicate the Facebook experience, which touched off an uproar among professional translators. Because LinkedIn operates in a more professional realm, it was accused of trying to take advantage of volunteer recruits.

The case of 'The Lost Symbol'

Translation agency bloggers also decried the decision of the publisher of ‘The Lost Symbol’, Dan Brown’s latest thriller, to have the new release translated into Swedish by a “crowd” of six in seven days. Understandably, the translator/service provider mentality has been skewed by years of bad business decisions by executives who often have not understood or valued language nuance or the marketing power of well-localized communications. But can we allow that the publisher made a business decision in which time to market and the potential for security leaks were heavily weighted?

The rant of the blogging community assumes that this example of crowdsourcing happened in a vacuum, with no management of process. In fact, the translators were deliberately selected. Perhaps that selection was based on similar styles and/or previous collaboration. Perhaps style and term guides were carefully prepared and strictly adhered to. Perhaps the translators worked in a server environment with translation tools that allowed them to communicate with one another and refresh their work instantaneously, deriving benefit from their collaborators. And perhaps in that structured translation environment a highly skilled lead translator guided the others. Most importantly, perhaps satisfying the Swedish market’s desire for near-simultaneous release was the consideration that outweighed all others, including absolute quality.

Professional crowdsourcing

New paradigms are typically received with coolness, suspicion, and hostility. The suspicion surrounding the concept of localization crowdsourcing closely resembles the wariness that greeted translation memory (TM) tools when they were first introduced. The comparison is apt. Translators, service providers, and clients now take the advantages of TMs for granted. Who today would sacrifice quality or throughput by not leveraging TM? TMs are even shared now through the Translation Automation User Society (TAUS). Is it possible for language service providers who embrace crowdsourcing to apply best business practices that improve quality and throughput precisely because they harness the crowd?

“Professional Crowdsourcing” can be a strategic way to manage workflows that is not radically different from the original management of translation teams. Its potential is built on the foundation of new and improved technical architecture combined with a readiness and inclination to collaborate that typify the “Net Gen”. More and more freelancers will see this as just another method of working in the digital age. Resistance to crowdsourcing will continue to erode as this workflow brings localized content to grateful recipients and enables access to new content.

If you have any doubts about this “crowdsourcing fad,” look outside the industry at the range of successful and fascinating crowd-aided enterprises:

  • Social networking: Facebook
  • Encyclopedia: Wikipedia
  • Operating system: Linux
  • Venture capital: Common Angels
  • Mutual fund: Marketocracy.com
  • Designer T-shirts: Threadless.com
  • TV advertisement: Doritos ad for Super Bowl XLI
  • Corporate R&D: Procter & Gamble
  • Scientific network for global R&D: InnoCentive by Eli Lilly
  • Online customer reviews: Amazon
  • Gene sequencing: Human Genome Project
  • Application software: SugarCRM and tens of thousands more

Next, read the thorough research report recently published by LISA, ‘Crowdsourcing: The Crowd Wants to Help You Reach New Markets’, which provides a broad and deep analysis of the phenomenon. Among other insights, the report asserts that crowdsourcing is being leveraged extensively by high-tech firms and, importantly, not as a cost-reduction option. In the report, Wayne Bourland, Senior Manager of Dell Computer’s Global Localization team, is quoted: “What we want eventually from our service provider(s) is a combination of localization, machine translation, and crowdsourcing services.”

Debunking the myths

Now that you see that crowdsourcing is here to stay, let’s debunk the myths. As the president of McElroy Translation, I led the 40-year-old language service provider through a technology facelift to empower crowdsourcing. It was a fascinating challenge that entailed deploying advanced technology, building new internal workflows, and enlisting the support of the translator community. McElroy is on a positive track as an industry leader in this field, with “live” crowdsourcing already deployed and the capability to expand this workflow for other types of client work. Pioneering professional crowdsourcing at McElroy gave me first-hand knowledge of the challenges…and of the many myths.

Myth #1: Crowdsourcing is unpaid work
It is in the interest of translation-industry leadership to dispel this misconception. As leaders harness productivity in new ways, they are compelled to think differently about how to compete and be profitable. Managed crowdsourcing is designed to increase throughput, but it can only be effective in a commercial environment when translators are valued and respected. Engaging, recruiting, motivating, and rewarding translators in this brave new world is a worthy challenge.

The nature of linguistic knowledge is that vocabulary tends to be shared in a very predictable fashion. Apart from some technical jargon (which is, of course, important in translation), most people with a working vocabulary of any given size know almost exactly the same words. This means that only highly skilled translators will be able to handle the more difficult aspects of any particular translation, and even they may still require time-consuming research. Meanwhile, there may be many in the “crowd” who can handle relatively large volumes of “routine” translation. The industry’s current per-word compensation model essentially incents the best translators to spend their time on the easiest tasks, because that is where they can generate the most volume.

Dr. Mark Ritter, previously chief language officer at McElroy Translation and the deputy chair of the German-to-English Certification Committee for the ATA, estimates that the famous 80/20 rule may actually be too optimistic with respect to patent translator productivity. In fact, as much as 90 percent of translator time may be spent on the most challenging ten percent of the translation. To legitimize professionally managed crowdsourcing, the industry is compelled to design compensation models that incent the most highly qualified translators to attack all and only the most challenging portions of the translation, while leveraging their expertise (not merely individual translated segments) in order to allow less experienced translators to deliver the routine portions of the work in a collaborative fashion.
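To make the incentive problem concrete, consider a back-of-the-envelope calculation in the spirit of that estimate. The sketch below uses assumed figures (word count, per-word rate, total hours); only the 90/10 split of time and content echoes the estimate above.

```python
# Illustrative sketch: why a flat per-word rate rewards routine content.
# All figures are assumptions for a hypothetical 10,000-word job; only the
# 90%-of-time-on-10%-of-content split follows the estimate cited above.

total_words = 10_000
rate_per_word = 0.15          # USD per word, assumed flat rate
total_hours = 50.0            # assumed time to finish the whole job

easy_words, hard_words = 0.90 * total_words, 0.10 * total_words
easy_hours, hard_hours = 0.10 * total_hours, 0.90 * total_hours

easy_hourly = easy_words * rate_per_word / easy_hours   # earnings/hour on routine text
hard_hourly = hard_words * rate_per_word / hard_hours   # earnings/hour on difficult text

print(f"Routine content: ~${easy_hourly:.0f}/hour")
print(f"Difficult content: ~${hard_hourly:.2f}/hour")
print(f"Ratio: ~{easy_hourly / hard_hourly:.0f}x in favor of the easy work")
```

Under these assumed numbers, an hour spent on routine text earns roughly 81 times what an hour spent on the hardest passages earns, which is precisely the incentive that redesigned compensation models would need to reverse.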

Jean-Luc Mazet, localization program manager and information developer for Hewlett-Packard, offered this insight on the changes: “In a professional or business setting, when moving into managed crowdsourcing, we harness the ‘power of the crowd’ and as a result we may be creating a cyclic movement that will lead us back to the ‘original’ contractor model, with one main benefit or gain: The openness and broadening of the bidding process. We ‘just’ changed the way we ALL do business.” Furthermore, Jost Zetzsche, language technology guru, recently challenged the industry when he wrote, “Would (crowdsourced) projects like this benefit from professional experience in areas like translation techniques, terminology management, translation memory maintenance? Of course they would! And it’s up to us to offer it in a palatable way (and ‘palatable’ is not a synonym for ‘free’ or ‘cheap’)”.

Myth #2: Crowdsourcing quality is unassured
The role of the language service provider (LSP) becomes even more important in a professional crowdsourcing environment. The power of the (well-compensated) crowd will drive improvements in throughput and allow for improved quality. It will still be up to the LSP to monitor the process, and ensure that the right level of terminology standardization and target language quality is maintained, all while addressing issues such as confidentiality and source control. The bottom line is that the progressive LSP will add more value while at the same time reducing or eliminating the repetitive tasks associated with project administration.
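One concrete way a progressive LSP adds that value is with automated checks layered over the crowd's output, such as verifying that prescribed terminology actually appears in delivered segments. The following is a minimal sketch of such a check; the glossary, the segment format and the function name are illustrative assumptions, not a description of any particular tool.

```python
# Minimal sketch of a terminology-consistency check an LSP might run over
# crowdsourced segments. Glossary entries and segment tuples are assumptions.

glossary = {
    # source term -> required target term
    "power cord": "cable de alimentación",
    "warranty": "garantía",
}

segments = [
    ("Do not remove the power cord.", "No retire el cable de alimentación."),
    ("The warranty is void if opened.", "La garantia se anula si se abre."),
]

def check_terminology(segments, glossary):
    """Flag target segments that do not use the prescribed target term."""
    issues = []
    for source, target in segments:
        for src_term, tgt_term in glossary.items():
            if src_term.lower() in source.lower() and tgt_term.lower() not in target.lower():
                issues.append((source, src_term, tgt_term))
    return issues

for source, src_term, tgt_term in check_terminology(segments, glossary):
    print(f"Review needed: '{source}' should use '{tgt_term}' for '{src_term}'")
```

In this toy run, the second segment is flagged because the delivered translation drops the accent prescribed by the glossary; a reviewer, not the script, decides whether that matters.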

Tools of the trade

So what tools are available, or are being developed, to facilitate collaborative communities?

Welocalize has been developing its GlobalSight and CrowdSight open-source systems. The Rosetta Foundation, which has a goal of providing localization services to economically less developed regions and societies, is already using these open-source systems. GlobalSight helps automate the critical tasks associated with the creation, translation, review, storage and management of global content, while CrowdSight, which is fully integrated with GlobalSight, is used specifically to engage the right “crowd,” group or community to deliver quick-turn translation for on-demand content. The Rosetta Foundation is working with organizations like Translators without Borders (TWB) to put these tools to work. TWB has translated about a million words per year, free of charge, for many non-governmental organizations around the world. TWB co-founder Lori Thicke explains: “Our translation projects are typically crowdsourced in the sense that we usually don't ask volunteers to contribute more than ten pages of translation, and usually much less, and about half of our projects can run anywhere from ten to 200 pages. We will be using CrowdSight soon, thanks to the Rosetta Foundation.”
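To illustrate what “engaging the right crowd” can mean in practice, here is a minimal, hypothetical sketch that matches an on-demand job to community translators by language pair and subject area. It is not a description of CrowdSight's actual implementation; the data structures and the coverage rule are assumptions.

```python
# Hypothetical sketch of routing a quick-turn job to a suitable crowd.
# Volunteer and Job are illustrative structures, not CrowdSight's data model.

from dataclasses import dataclass, field

@dataclass
class Volunteer:
    name: str
    language_pairs: set                        # e.g. {("en", "sw")}
    domains: set = field(default_factory=set)  # empty set = any subject area
    active: bool = True

@dataclass
class Job:
    source_lang: str
    target_lang: str
    domain: str
    word_count: int

def select_crowd(job, volunteers, max_words_per_person=300):
    """Return enough matching volunteers to cover the job's word count."""
    matches = [
        v for v in volunteers
        if v.active
        and (job.source_lang, job.target_lang) in v.language_pairs
        and (not v.domains or job.domain in v.domains)
    ]
    needed = -(-job.word_count // max_words_per_person)   # ceiling division
    return matches[:needed]

volunteers = [
    Volunteer("Asha", {("en", "sw")}, {"health"}),
    Volunteer("Baraka", {("en", "sw")}),
    Volunteer("Claire", {("en", "fr")}, {"health"}),
]
job = Job("en", "sw", "health", word_count=550)
print([v.name for v in select_crowd(job, volunteers)])    # ['Asha', 'Baraka']
```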

Meanwhile, McElroy built a crowdsourcing system with best-of-breed technologies that interface seamlessly. For very fast turns involving short sections to be translated (e.g. user-generated content on a review website), McElroy successfully uses a lightweight, browser-based Translation Environment Tool (TEnT) that integrates completely with the McElroyHUB™, a vendor management tool that automates project assignments. For more complex projects that would benefit from a full-featured TEnT and/or TM or machine translation, McElroy chose Kilgray’s MemoQ. A seamless integration between MemoQ and the McElroyHUB™ is being finalized; it will allow translators to accept a job on the McElroyHUB™ and immediately gain access to the MemoQ project on the server. Teams working in this technical environment can support each other in real time with shared terminology, TMs and project-specific chat rooms. Thoughtfully managed crowdsourcing can increase access to the right teams for the right projects.
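The acceptance-to-access handoff described above can be pictured as a simple event flow: the hub records the assignment when a translator accepts a job, then triggers provisioning of access to the shared server project and its resources. The sketch below is a generic illustration of that pattern under assumed names; it does not use the actual McElroyHUB™ or MemoQ server interfaces, and every function here is a hypothetical stand-in.

```python
# Generic sketch of a hub-to-TEnT handoff when a translator accepts a job.
# grant_server_access() and notify_team_chat() are hypothetical placeholders,
# not calls into MemoQ or the McElroyHUB; they only mark where such calls go.

from datetime import datetime, timezone

assignments = {}   # job_id -> assignment record kept by the hub

def grant_server_access(translator_id, project_id):
    # Placeholder: a real system would call the TEnT server here to add the
    # translator to the project, its shared TM and its term base.
    print(f"Granting {translator_id} access to project {project_id}")

def notify_team_chat(project_id, message):
    # Placeholder for posting to the project-specific chat room.
    print(f"[{project_id} chat] {message}")

def accept_job(job_id, project_id, translator_id):
    """Record the acceptance on the hub, then provision project access."""
    if job_id in assignments:
        raise ValueError(f"Job {job_id} has already been accepted")
    assignments[job_id] = {
        "translator": translator_id,
        "accepted_at": datetime.now(timezone.utc).isoformat(),
    }
    grant_server_access(translator_id, project_id)
    notify_team_chat(project_id, f"{translator_id} has joined job {job_id}")

accept_job("JOB-001", "PROJ-42", "translator_17")
```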

Was Tom Sawyer the original crowdsourcing pioneer? It is a clever analogy, proposed by McElroy’s former vice president of strategy, Bob Donaldson. It is absurd to think that the localization industry is somehow immune to the societal influences and market demands that support crowdsourcing. The challenge for our industry is to assert a definition of crowdsourcing as a managed and professional process, one that best meets the needs of translators and clients.