Human Rights by Design: lessons from Cryptocat

Frank Smyth of the Committee to Protect Journalists writes about Cryptocat, a chat tool built to combine ease of use with strong cryptographic protections for people who need to communicate online securely.
The idea was to give non-expert journalists and human rights defenders tools they could use with greater digital safety, without requiring much training or specialized security awareness.
After initial fanfare in the New York Times, Forbes, and Wired, further investigation found that Cryptocat left users vulnerable to attack. Smyth does not rub the failure in the makers’ faces; instead he focuses on the wicked problem of protecting highly vulnerable users — who are often also the least tech-savvy — with tools that remain genuinely easy to use.

He writes:
“So we are back to where we started, to a degree. Other, older digital safety tools are ‘a little harder to use, but their security is real,’ Ball added in Wired. Yet, in the real world, from Mexico to Ethiopia, from Syria to Bahrain, how many human rights activists, journalists, and others actually use them? ‘The tools are just too hard to learn. They take too long to learn. And no one’s going to learn them,’ a journalist for a major U.S. news organization recently told me.

Who will help bridge the gap? Information-freedom technologists clearly don’t build free, open-source tools to get rich. They’re motivated by the recognition one gets from building an exciting, important new tool. (Kind of like journalists breaking a story.) Training people in the use of security tools or making those tools easier to use doesn’t bring the same sort of credit. Or financial support. Donors—in good part, U.S. government agencies—tend to back the development of new tools rather than ongoing usability training and development. But in doing so, technologists and donors are avoiding a crucial question: Why aren’t more people using security tools?

These days—20 years into what we now know as the Internet—usability testing is key to every successful commercial online venture. Yet it is rarely practiced in the Internet freedom community. That may be changing. The anti-censorship circumvention tool Tor has grown progressively easier to use, and donors and technologists are now working to make it easier and faster still. Other tools, like Pretty Good Privacy or its slightly improved free alternative GnuPG, still seem needlessly difficult to operate. Partly because the emphasis is on open technology built by volunteers, users are rarely if ever directed how to get back on track if they make a mistake or reach a dead end. This would be nearly inconceivable today with any commercial application designed to help users purchase a service or product.

Which brings us back to Cryptocat, the ever-so-easy tool that was not as secure as it was once thought to be. For a time, the online debate among technologists degenerated into the kind of vitriol one might expect to hear among, say, U.S. presidential campaigns. But wounds have since healed and some critics are now working with Kobeissi to help clean up and secure Cryptocat.

Life and death, prison and torture remain real outcomes for many users, and, as Ball noted in Wired, there are no security shortcuts in hostile environments. But if tools remain too difficult for people to use in real-life circumstances in which they are under duress, then that is a security problem in itself. The lesson of Cryptocat is that more learning and collaboration are needed. Donors, journalists, and technologists can work together more closely to bridge the gap between invention and use.”