If you are a parent like me, you probably get pangs of guilt every time you turn the TV on in front of the kids, or hand them an iPad or an iPhone. Our pediatrician told us that children should get no TV before they turn 2, and only very small doses even after. In general we adhere to it, but the guilt doesn't go away, does it?
The reality is that our kids are not the first generation of children being exposed to screen time. Most of us grew up with TVs, and a lot of us with computers as well. Children now, though, have a lot more screens in their lives with the advent of iPhones and iPads.
It is important to first recognize that not all screen time is created equal. Watching a movie or a video on a TV is a very different use of a screen than playing a game, which in turn is very different from programming a computer or making a digital painting -- yet all of these activities require a screen.
One interesting development in recent years has been the world of connected play -- where physical products connect to smart devices, often iPhones or iPads, and give kids a new way to play. How do you look at screen time in that world, where children are interacting with a physical object while still using a computer with a screen?
Interestingly enough, not all screen time is created equal even in the world of connected play. There are three kinds of connected play experiences that we see being created:
- Tangible Manipulation, e.g., Tiggly, Osmo, Bridging Book, Ubooly
This is the space that has seen perhaps the most invention. As kids play with the physical toy, the screen interacts with them to extend the play experience. The Tiggly app detects the pieces you place on the screen to determine the right response. With Osmo, your iPad tells you when you have the right answer by detecting the position of your physical pieces. All of these products extend on-screen play into the physical world.
For these products, the screen still plays a key role in the interaction -- it just happens that the physical manipulation may not involve your hands on the screen as much.
- Remote Control, e.g., Sphero, Anki Drive, Furby

Not only in toys, but in every aspect of our lives, iPhones have transformed how we control and interact with the world around us. From controlling our homes to improving our fitness, the iPhone has heralded a new world of innovative products.
It is just as true for toys. It is inevitable that the smartphone will become the default remote controller for RC toys. Traditional toy companies may still lag behind, but new companies in this space are pushing the boundaries. There are robotic toys, such as those from WowWee and Sphero, that you can control and manipulate using an iPhone. Anki Drive may well replace slot cars. You can use an app on your phone to feed and interact with the Furby.
- Creative Tool, e.g., Wonder Workshop

A different way of looking at the screen is to treat the device as a computer with a new way of manipulating data, one that is both tangible and kid-friendly. This is how Wonder Workshop looks at touch devices.
When you use your iPad or iPhone to program the robots, you get to use the device as a creative tool and not just a screen: it becomes a tool to manipulate the physical world and invent your own toys. You use the screen to manipulate data and create programs, with an interface designed to take advantage of touch screens. Eventually, the role of the screen ends: when you are done programming, you can wirelessly save the programs onto the robots and put the screen away. Your robot remembers your code and is ready for play.
The truth is that innovations like Wonder could only happen because iPhones and iPads came first. These devices bring a child-friendly touch interface, seamless wireless communication with robots over Bluetooth LE, and enough computing power to run complex robotics programs -- all critical for Wonder and the robots Dash and Dot to exist.
Our children are going to be digital natives -- growing up in a world where they have never been without the Internet, or without access to it at their fingertips. How do we then set them up to discover, create, and learn with these screens, rather than simply consume media?