We ask a lot of ourselves as children. Somehow we must go from sensory lumps to mobile, rational, attentive communicators in just a few years. Here you are, a child with no vocabulary, in a room cluttered with toys and stuffed animals. You pick up a Lincoln Log and your caregiver says, “This is a 'log'.” You eventually understand that “log” does not refer strictly to this particular brown plastic cylinder, or to brown plastic cylinders in general, but to brown plastic cylinders that embody the characteristics of parts of felled and denuded trees, which are also, of course, “trunks”.
There has been a lot of research and heated debate about how children achieve this. Some scientists argue that most of our language acquisition can be explained by associative learning: we relate sounds to sensations, just as dogs come to associate the ringing of a bell with food. Others argue that features inherent in the human mind have shaped the forms of all languages and are crucial to our learning. Still others argue that children build their understanding of new words on top of the words they already understand.
This debate advanced on a recent Sunday morning, as Tammy Kwan and Brenden Lake delivered blackberries from a bowl into the mouth of their twenty-one-month-old daughter, Luna. Luna was dressed in pink leggings and a pink tutu, with a silicone bib around her neck and a soft pink hat on her head. A lightweight GoPro-style camera was attached to the front of it.
“Babooga,” she said, pointing a round finger at the berries. Dr. Kwan gave her the last of them, and Dr. Lake looked at the empty bowl with amusement. “That's about 10 dollars,” he said. A light on the camera flashed.
For an hour every week over the past 11 months, Dr. Lake, a psychologist at New York University whose research focuses on human and artificial intelligence, has attached a camera to Luna and recorded things from her point of view as she plays. His goal is to use the videos to train a language model on the same sensory input a child is exposed to: a LunaBot, so to speak. In doing so, he hopes to create better tools for understanding both artificial intelligence and ourselves. “We believe this research finally establishes a connection between those two fields of study,” Dr. Lake said. “You can finally put them in dialogue with each other.”