
This article was published on November 26, 2013

It’s time that we knew more about what startups are doing with our data



Earlier today we reported on an apparent security problem with trivia app QuizUp that saw users’ contact data sent in plain text. Developer Plain Vanilla blamed a fault with third-party software that has now apparently been fixed, but the episode highlighted an important issue – we don’t know enough about how well startups protect our data.

We’re giving increasing amounts of personal data to teams that are often small, inexperienced and making their company up as they go along. From a business point of view, these attributes can be an advantage, but from a security point of view… well, we just don’t know. We’re handing over our data entirely on trust that it’s being looked after properly.

A quick look at my phone reveals multiple apps from startups that have a detailed record of my location history, a copy of my photographs, access to my email accounts, a record of my physical activity data… the list goes on. How well is that information being protected on those startups’ servers? I have no idea.

The attack on Buffer, in which it emerged that the company wasn’t doing everything possible to protect itself (and its users), is a recent example of a company with good intentions that didn’t keep security as tight as it could have been.


The recent promo video for credit card replacement device Coin made me chuckle, as security was literally an afterthought in the sales pitch. “Oh, security,” it says towards the end of the video, and even then it’s the security of the device, not your data, that’s discussed. Not to pick on Coin (this was an ad for a product that’s still in development, after all), but it really does seem to sum up the attitude of a lot of the industry at the moment.

A solution? User data audits

So, what’s the solution? One that I’ve been mulling for a couple of weeks now is the idea of a data security audit program for startups. Any startup that met a suitable standard of security could display a certification logo on its website, within its app or wherever it felt appropriate.

The audits would have to be affordable for startups, and to ensure they were run in users’ best interests, I’d suggest that a non-profit trust be set up to operate them. The trust would be overseen by a board made up of a diverse range of industry heavyweights, and audits would be offered at cost as a way of ensuring that public trust in the tech industry remains solid and security awareness remains high.

The BSI Kitemark is a recognized symbol of quality on products in the UK. Audited startups could use a logo in a similar fashion.

In the past week, I’ve put the concept of startup security audits to a number of tech entrepreneurs who are in the business of collecting user data. All were supportive of the idea, although most preferred not to be quoted on the record. In some cases, they preferred to stay publicly silent on the issue because they didn’t want to draw attention to their own current security setups. However, it was reassuring to hear that some had contracted independent consultants to check that their data security arrangements were up to scratch.

Martin Källström, CEO of Narrative, a startup that will be storing an image from every 30 seconds of its users’ lives when its much-delayed wearable camera launches, was particularly positive: “Only speaking for my own company, this would be very interesting and something we would love to be part of. We already have regular audits planned from computer security experts, but that is only one facet of user data. Of course since we are investing a lot in doing things right, we are looking for ways to make that clear to the outside world. A third-party auditor with good reputation would be great.”

Källström suggested that the ‘certification’ could be “modularized much like (Creative Commons) licensing is, and neutral in its communication about in what ways a company is working to protect user integrity, without being damning to companies that are doing at least some things right.”

Security at scale

This isn’t a flawless idea. Scaling the audits to accommodate demand would be a problem, as would ensuring that they were affordable while also being carried out by people with the skills to do them properly.

Then there’s the role of big companies. Adobe and Sony aren’t startups, and both have suffered big losses of user information in the past couple of years. Perhaps companies of that size could participate too, paying significantly more into the system to help subsidize smaller startups.

There’s also the risk that displaying a security audit certification to users would encourage more sophisticated attacks from hackers. Maybe ‘security through obscurity’ isn’t such a bad thing.

Either way, we (users, the industry and the media) need to have more of a conversation about data security, because as time goes on, increasingly personal data will be available in more places, to more people, than ever before. Right now, we have no way of knowing who to trust or how much to trust them.

Image credit: luchunyu / Shutterstock
