German authorities urged owners of a talking doll called Cayla to destroy the toy due to fears that hackers could use Bluetooth to access personal data © Getty Images

As Christmas approaches, internet-enabled smart toys are likely to feature heavily under festive trees. While some dolls of decades past were only capable of speaking pre-recorded phrases, modern equivalents boast speech recognition and can search for answers online in real time.

Other connected gadgets include drones and cars such as Nintendo’s Mario Kart Live: Home Circuit, in which players race each other in a virtual world modelled after their home surroundings.

But for all the fun that such items can bring, there is a risk — poorly secured Internet of Things toys can be turned into convenient tools for hackers.

Insecure smart toys pose several types of hazard, according to Greg Day, vice-president and chief security officer for Emea at security company Palo Alto Networks. “As a parent, the one I would be most protective of is spying on my children,” he says, pointing in particular to security weaknesses in video and voice-enabled baby monitors.

Similar concerns were raised in 2017 by Bundesnetzagentur, Germany’s telecoms watchdog, over fears that hackers could use an unsecured Bluetooth device to access personal data from a talking doll called Cayla. The watchdog urged parents to destroy the dolls.

For older children, smart toys can pose a different set of risks around content. “[With products such as] digital cameras that you click a button and it streams to YouTube, Instagram or TikTok . . . you’re going to be saying: ‘Are my children accessing content that’s appropriate . . . and most importantly is who [else] has access to that content?’,” says Mr Day.

Nintendo’s Mario Kart Live: Home Circuit lets players race each other in a virtual world modelled after their home surroundings © Nintendo

When consumer group Which? worked with Accenture Security Context, a unit of the consultancy, researchers found vulnerabilities in dolls that could allow outsiders to control the images and audio files the toys played.

However, these devices do not just pose risks to children — a poorly secured toy can become a bridge for targeting other devices on the same network. Smart TVs, speakers, phones and laptops can all be at risk, says Mr Day. “It’s a possible method of getting your banking credentials . . . or you and your children’s ID credentials.”

And smart toys are not just for children, adds Mr Day: about one-third of workplaces contain connected teddy bears, robots and other IoT products. “You’re never too old to be a big kid,” he says — but this only widens the pool of potential targets for hackers.

The most common problems are insecure configurations, a lack of two-factor authentication and poor encryption, says Charles Henderson, global head of IBM’s X-Force Red unit, which tests device security.

“Smart toys come with some cool features — they can pull updates from the cloud to a toy that says things . . . and make it more dynamic, but they just need to do it safely,” he says, likening it to the need to ensure toys are non-toxic.

While he says a growing number of consumer groups are fighting to make toys safer, in many ways the industry remains a “Wild West”, with limited clarity on what constitutes a safe smart toy. For parents keen to get their children a high-tech toy, he recommends buying from brands they trust rather than the cheapest option. “Don’t get ‘new’ old stock,” he adds — referring to products that have languished on shelves — because their security features are probably outdated.

“Does it have multi-factor authentication [eg requiring users to identify themselves as trusted using a secondary device]? What other security features are in place?” asks Mr Henderson.

Regulators are beginning to move against the problem, with the UK government among those taking a more proactive stance. “The UK is doing all the right things,” says Katie Vickery, a partner at international legal practice Osborne Clarke. “It’s been on the front foot, engaging with industry.”

The Information Commissioner’s Office’s code of practice on age appropriate design, which came into force in September, deals with connected toys and advises producers to take a number of steps to secure them. These include providing information about the use of personal data and being clear about who is processing it.

However, Ms Vickery says that none of these steps are legally binding and any legislation is some way off. “We’ve got a technical standard . . . for businesses [producing smart toys], but there is now a lot of discussion around whether it should be mandatory,” she says, adding that suggestions such as clear labelling on secure products would be a valuable step.

“I think that there is an expectation that these devices are secure [for consumers],” says Ms Vickery. “There’s a real opportunity for businesses . . . to design their products [well] and make a virtue out of it.”
