Children are vulnerable. Protecting them matters. When it comes to technology, do you want developers you don’t know, and over whom you have no control, watching what your children do on their devices?
Apple recently cut developers off from using MDM software to drive third-party parental control solutions.
Developers were upset. Just days before Apple’s WWDC 2019 conference, seventeen smaller developers you’ve probably never heard of got together with a well-organized PR campaign and a professional website to demand access to new APIs that would let them develop parental control software for iOS.
One of these developers, Kidslox, previously filed an EU complaint against Apple’s move.
The issue is that, in order to make these controls work, developers had been making use of Apple’s Mobile Device Management (MDM) remote access tools.
The problem is that those tools – designed to help enterprise users manage fleets of devices – gave app developers the power to tamper with apps and access user information, which Apple says it did not consider appropriate.
Who watches the watchmen?
The argument is that not only does no rational parent want unregulated and unaccountable third-party entities accessing data about their kids, but lack of regulation means there’s little control over what happens to that data once it is collected.
This is why Apple withdrew those parental control apps from its App Store.
“We recently removed several parental-control apps from the App Store, and we did it for a simple reason: They put users’ privacy and security at risk. It’s important to understand why and how this happened,” Apple wrote, explaining the move.
Bad Apple, no
Now the developers are campaigning for Apple to create a set of parental control APIs they can use in their apps.
“Apple should release a public API granting developers access to the same functionalities that Apple’s native ‘Screen Time’ uses,” they claim.
While the developers make what seems on the face of it a good case, I do feel that case would be stronger if, as well as proposing an API, they also committed to minimum privacy standards and publicly accounted for:
- What data they collect, and why.
- How it is used.
- How it is protected.
- How parents can manage their kids’ data.
- How data can be deleted.
- If such data is resold.
Always ask why
I didn’t have enough time to investigate every company involved in the campaign, but one quick Google search on one of the 17 firms revealed claims (which I’ve not corroborated, though the source seems solid) that 281GB of images and videos of children that company monitored were leaked online.
I guess there is a benign explanation for the company’s need to store all those images and videos of other people’s children, but if I were a well-resourced journalist at any of the major news outlets who are reporting the current campaign, I’d have to ask why that data was retained.
It’s not the only time third-party parental control apps have demonstrated shaky security.
To be fair, I’m not saying, and I do not know, that all such software developers gather data of this kind, but it certainly seems reasonable to insist that companies with access to data about your children are utterly transparent about how they protect that data and what they do with it.
And the best way to secure such information is not to let others have it in the first place.
Perhaps Apple is right
I always try to support developers, but my tiny investigation makes me hesitate to jump enthusiastically aboard the latest “Apple is bad” meme.
It is worth noting the current campaign to cast Apple as being anti-competitive comes at a bad time.
The company’s record on privacy has come under fairly consistent attack since it dared to point out its platform is inherently far more secure than the dominant mobile OS.
The current campaign may turn out to be a little counterproductive in terms of furthering the ongoing ‘Apple is bad’ narrative. After all, if a two-minute web search exposes instances in which the security of such software has been compromised, that rather supports Apple’s argument that it needed to switch off access to its MDM tools in order to protect its customers.
Yes and no
I didn’t have time to investigate every company. I’m not saying they are all subject to such problems, nor that it is likely they are. I’d even concede that the campaign does signify a need for APIs developers can use in order to deliver more personalised iOS parental protection features than Apple makes available.
There are as many usage cases as there are parents and children.
At the same time, I’d urge anyone using parental control software on any platform to read the developer’s privacy and security policies thoroughly.
It’s also important to ensure developers can be held to account in the event they betray your trust, and that you as a parent have complete access to any data those developers hold about your children.
In the event you do not have such control, I’d recommend not using the solution.
This, surely, is common sense?