The digital devices that will have the biggest impact on your kids may not be the ones you expect. Increasingly, the next generation’s earliest relationships with technology will be shaped by the consumer ‘Internet of Things’ creations we call connected toys, rather than by mundane, obviously technological devices such as laptops and smartphones.

Connected toys promise richer and more sophisticated interactions than those possible with the gadgets of previous generations. A modern family home may very well feature everything from small robots that take on the duties of an impossibly low-maintenance pet, to voice assistants that can help with homework well beyond the depth and breadth of even the brainiest parents. Optimistically, these devices even begin to prime kids for the diverse personalities and experiences they will encounter in their lives outside the home.

The reality of these products, however, has often fallen short of their potential. As with other always-online devices, connected toys have lately been illustrating the brittleness of our security standards. Security researchers routinely find that seemingly innocuous devices like app-connected race cars can leave personal data like a child’s name, age, location, and photo very much exposed. The worst offenders even risk having a stranger secretly listen in on or watch a child’s playtime by snooping on unprotected live audio and video streams intended for dubiously-crafted services ‘in the cloud’.

The problem isn’t just one of security. Without oversight or external accountability, it is impossible to know whether children’s data is being harvested in order to market to them more effectively. The larger question remains: by handing children devices that opaquely stream their interactions and personal information to far-off databases on a corporate server somewhere, are we indoctrinating them to constant surveillance?

Ultimately, what we need to be guarding against isn’t some small cabal of hackers looking to steal kids’ innocence or identity; it’s malicious (and more often than that, negligent) design. There’s no reason devices have to work this way, except that they were built to. On a technical level, there is nothing to prevent manufacturers from, say, encrypting the data locally on the device before transmitting it, if it even needs to be transmitted in the first place. Whether out of short-sighted simplicity or small savings in development cost, manufacturers almost universally choose not to take these basic precautions.
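To make the point concrete, here is a minimal sketch of the “encrypt locally before transmitting” step. The names are hypothetical, and the XOR one-time pad stands in for a real cipher purely to keep the example self-contained; an actual product would use an audited library (e.g. an AES-GCM or libsodium implementation) with proper key management.

```python
import secrets

def encrypt_locally(data: bytes, key: bytes) -> bytes:
    """One-time-pad XOR: a stand-in for a real audited cipher.
    The same function decrypts, since XOR is its own inverse."""
    assert len(key) == len(data), "one-time pad key must match data length"
    return bytes(d ^ k for d, k in zip(data, key))

# Raw playtime data never leaves the device in the clear.
message = b"child asked for a bedtime story"
key = secrets.token_bytes(len(message))      # generated and kept on the device
ciphertext = encrypt_locally(message, key)   # only this would be transmitted

assert ciphertext != message                        # unreadable without the key
assert encrypt_locally(ciphertext, key) == message  # round-trips on the device
```

The design choice the sketch illustrates is simply that encryption happens before the network boundary, so an eavesdropper on the wire sees only noise.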

The manufacturers’ laziness becomes a problem for consumers. Where parents may have the context and digital ‘street-smarts’ to understand when a product isn’t working in their best interests, most kids do not. Parents bear the weight of protecting their children physically and emotionally, and now we’ve added information security too. It doesn’t have to be this way. By making a few decisions that prioritize users’ privacy and wellbeing, manufacturers can just as easily create products of a humane and inviting design.

The reactionary way to rein in connected toys is to create laws that limit the exposure of their users—in this case, the screen-addled teens of tomorrow. European regulators are forging ahead with this approach, but we have a model here as well in California’s 2014 SOPIPA bill, which limited the uses of data collected on children in educational settings.

Another approach is through independent certification programs like the Digital Standard project from Consumer Reports. Programs like this offer an opt-in way to start the best-practices conversation, leveraging ideas of good taste and brand cachet towards setting valuable privacy milestones, though those signals may disproportionately reach the affluent and nerdy.

When you get down to it, no precedent has been set for what being ‘good’ looks like in this space. Because of that, the discussion of best practices becomes tricky — assessing new technology offerings requires repeatedly asking what expected and unexpected consequences similar products have had in the past. Where are products exposing kids to dark patterns and manipulation, and where are products opening the door to creative interactions that prime a deliberate relationship with technology?

Transparency will be a key feature of this better, more deliberate relationship between users and devices. Just as security-minded open source software allows public inspection, ethically-designed toys will allow users (and their parents) to understand everything they’re doing, and will clearly lay out what happens to the data they collect. Optimistically, they will also allow users to opt out of any features that could be interpreted as invasive or exploitative, such as speech recognition that requires transmitting data to the cloud, where it can be used to train a machine learning model or mined for consumer research without oversight.

The design process for these toys will be an ongoing one. Now that software, rather than hardware, defines key elements of a product experience, iteration on that experience is both practical and valuable. Thoughtful creators will feed their users’ enthusiasm and concerns back into the product roadmap, leading to truly human-centric experience and product design in ways never before possible.

A few newly emerging technologies are giving designers even more avenues to pursue humane design. New classes of low-cost, low-power processors are driving a design model known as “edge compute”. Rather than having “things” creating rich and dynamic experiences by connecting to remote services, these new devices run machine learning programs locally, and only connect to the internet where it truly makes sense to. The result is devices that learn locally, provide rich interactions, and stay secure through regular patching, all without exposing consumers’ data to arcane uses.
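The edge-compute pattern described above can be sketched in a few lines. Everything here is illustrative: the threshold-based `detect_wake_word` stands in for a small on-device model, and the event payload shape is invented for the example. The point is the shape of the data flow, not the detection logic.

```python
def detect_wake_word(audio_samples: list[int], threshold: int = 200) -> bool:
    """Toy stand-in for an on-device model: flags audio whose loudest
    sample crosses a threshold. A real toy would run a small neural net
    locally instead."""
    return max(abs(s) for s in audio_samples) >= threshold

def handle_audio(audio_samples: list[int]) -> dict:
    """Raw audio stays on the device. Only a tiny derived event is ever
    marked for transmission, and only when something actually happened."""
    if detect_wake_word(audio_samples):
        return {"event": "wake_word", "transmit": True}   # no audio in payload
    return {"event": None, "transmit": False}             # nothing leaves the device

assert handle_audio([3, -5, 250, 12])["transmit"] is True   # loud sample: send event
assert handle_audio([3, -5, 12])["transmit"] is False       # quiet: stay offline
```

Because inference runs locally, the cloud only ever sees a yes/no event rather than a live audio stream — which is exactly the property that keeps a snooped connection from exposing a child’s playtime.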

The objects next generations will grow up playing with are an important domain for our cultural contentions around technology. As ideas of responsible connectivity, security, and data privacy enter public discourse, the kids being exposed to the frontlines of the discussion will be more deeply impacted by technology than any before them. Ultimately, these connected toys are the products that will set the tone of the relationship going forward. If designed thoughtfully, they can add these very interesting and very powerful modes of interaction into kids’ lives in a way that encourages empowerment, exploration, and wonder. Done carelessly or by the lowest bidder, they usher in a dystopian minefield of skepticism and malicious intent through which kids will have to tread lightly.

Consumers hold much power over this decision. The dramatic drop in the cost to compute means that it is design, not technical specs, that separates good products from bad. The new generation of consumers understands this, and selects for it. Smart manufacturers will respond by bringing concern for security, usability, and transparency to the forefront. Those that don’t, do so at their peril—aside from the threat of regulation, consumers will vote with their wallets.

Gustavo Huber

IOT Practice Lead

Los Angeles

+1 844 946 SVSG ex. 719

Gustavo Huber is an experienced hardware product architect with over 1 million devices shipped under his belt. He has led lean R&D teams to design and deliver solutions ranging from Industrial IOT and Edge Computing for Fortune 1000 companies to educational toys and open source single-board computers for mass market. His prior projects include smart cities infrastructure, agricultural sensor networks, consumer products, and custom devices for large-scale marquee events.

Find Out How Gustavo & SVSG CTOs
Can Empower Your Business
