May 1, 2024

Grok tries to help me analyze the "ethicality" of attaching a camera to your baby's head and deviously distracts me with the question of gluing hair onto the head of one's 3-year-old.

Here's my screen shot (and I'll tell you in a minute why I was asking this question):

I don't know why Jim Gaffigan had his question, but I had my question because I was reading the NYT article "From Baby Talk to Baby A.I./Could a better understanding of how infants acquire language help us build smarter A.I. models?" 

We read about a 20-month-old girl who wears "a soft pink hat" with a "lightweight GoPro-type camera... attached to the front." This particular child is only wearing the camera once a week for one month, but scientists are asking...
... interlocking questions — How humanlike can we make A.I.? What makes us human?... To pursue the former question piece by piece, by modeling social interactions, intentions and biases, by collecting comprehensive video footage from a headcam mounted on a one-year-old, is to move closer to answering the latter.

“If the field can get to the place where models are trained on nothing but the data that a single child saw, and they do well on a huge set of tasks, that would be a huge scientific achievement,” Dr. Lake said.

I didn't see anything in the article questioning treating a baby this way. The child is presented as perfectly adorable and happy. But what happens next? Will parents be paid to put their baby through this data collection process for the benefit of the A.I. industry? Can the baby say no? 

Ironically, the article ends with some lighthearted fun about a baby saying no:

[The mother] said, “There was a time when she didn’t know the word ‘no,’ and it was just ‘yes’ to everything.” She addressed Luna: “Kisses, do you want kisses?”

“No,” Luna said.

“Oh,” [the father] Dr. Lake said, laughing. “I do miss the ‘yes’ phase.”

Can we infer the consent of the baby? Would you want your one and only babyhood to be used as a science experiment or commercial data-collection enterprise? Wouldn't it affect how your parents spoke to you and what they encouraged you to do? 

By the way, is it now the rule that parents must obtain express consent — "Kisses, do you want kisses?" — before kissing their baby?

ADDED: I was "deviously distracted" by "the question of gluing hair onto the head of one's 3-year-old" because I assumed the Gaffigan tweet was more relevant to the question I'd asked than it was. As a commenter pointed out, Gaffigan was joking about stealing the baby's hair to glue onto his own (bald) head. The question I'd asked Grok was about attaching a camera to a baby's head, so I rashly assumed I'd been served up another question of attaching something to a baby's head. No. That wasn't it. It was another case of a parent messing with the baby's head, but it was about detaching, not attaching. You know, often babies are bald or near bald, and the parent seeks to do something about that, so gluing hair on — as something a foolish parent might do — didn't seem that farfetched.

41 comments:

MadisonMan said...

I would never ask a child under, say, 2 or 3, if I could kiss them. By the time they're 5, yes.

tim maguire said...

I don't see what the issue is. If the camera is on the baby's head, then the baby is not on camera. You're not recording the baby, you're trying to get a sense of what the baby sees.

The only ethical issue I can imagine is if the bulk and weight of the camera interferes with the baby's movement and comfort.

robother said...

Gaffigan is wanting to glue his kid's hair to HIS (presumably balding) head. I don't recall ever being asked for permission by a parent before a haircut. I started declining in the early 60s.

Temujin said...

I find it an insane thing to do to your child. Self-absorbed people would be the ones to try it. Baby-focused parents would not.

That's my view on it.

Ann Althouse said...

"I don't see what the issue is. If the camera is on the baby's head, then the baby is not on camera. You're not recording the baby, you're trying to get a sense of what the baby sees."

1. You are recording what the baby points its head toward, that is, the choices the baby makes.

2. You are recording the baby's experience — what it sees and hears.

3. You are recording the parts of its own body the baby looks at.

4. You are recording all the sounds the baby makes — efforts at speech, crying, moaning, farting, shitting.

What the baby's face looks like is an important aspect of the baby's interest in privacy, but why are you prioritizing it to such a degree that you have lost awareness that the baby is being recorded?

Kate said...

Grok only considers the social ethics, not the intrinsic morality of the body. There are ways we interfere with a baby's body -- diapers must be changed -- but even a small child needs to have their physical personhood respected.

Grok sees us as giant bags of mostly water.

Duke Dan said...

Similar baby experiments have been done before:

A MAN ONCE TRIED TO RAISE HIS SON AS A NATIVE SPEAKER IN KLINGON

MadTownGuy said...

"Can we infer the consent of the baby? Would you want your one and only babyhood to be used as a science experiment or commercial data-collection enterprise?"

Babies are the subject of many medical and behavioral studies, though you're right about the privacy issue when applied to an individual vs. group anonymity.

Toddlers aren't the subject of gender reassignment yet, but the recommended age for medical and behavioral therapies is <a href="https://apnews.com/article/gender-transition-treatment-guidelines-9dbe54f670a3a0f5f2831c2bf14f9bbb">creeping down.</a>

wildswan said...

The camera will interfere with how everyone responds to the baby just at the time when the baby is learning how to respond to people. Instead of an interaction where people are playful and unselfconscious and the baby responds to that, people will likely be stiff and formal or at least self-conscious and on display. Showing off for the camera. That will harm the baby. It's a terrible idea.

Breezy said...

The only reason it’s done with babies is because they can’t say no. It crosses the acceptability line imho.

Howard said...

Invokes the Heisenberg uncertainty principle. The real issue is by mounting a camera on the baby's head ultimately you're going to be changing the baby's behavior. That's what's wrong with it.

Aggie said...

Bring back the Skinner box!

lamech said...

“Babies like all individuals, have a right to privacy, and recording their every move and word without their consent could be considered a violation of that right”

I consider myself a reasonably accomplished and intelligent person (in my late 40s), yet dealing with smartphones, apps and other modern technologies (e.g. telematics in cars), I often feel akin to a baby, practically unable to manage (and seemingly often thwarted from managing) the ever-changing tools and policies for consent and settings.
Under many circumstances involving data-collecting technologies, I do not see how settings-management tools or click-through approvals can or should be valid sources from which to infer actual consent.
This present view is informed/driven by spending over 30 minutes yesterday trying to change a setting on an iPhone. Apple’s official help sources were terrible. Eventually, a YouTube video provided the necessary information.

Bob Boyd said...

I do miss the ‘yes’ phase.

We're all going to be saying that about AI soon.

The AI industry wants to train AI with a camera on a baby so it can do a better job spying on and controlling the development of all of us including the baby.

They want to train AI to eventually train all babies.

rhhardin said...

Kids learn language by learning to disassemble and reassemble cliches.

Bob Boyd said...

All in all, it's just another brick in the wall

Rafe said...

“Invokes the Heisenberg uncertainty principle. The real issue is by mounting a camera on the baby's head ultimately you're going to be changing the baby's behavior. That's what's wrong with it.”

That’s…not what the Heisenberg Uncertainty Principle is, dummy.

You’re talking about the “observer effect.”

- Rafe

Rusty said...
This comment has been removed by the author.

rehajm said...

AI wants you to look at comedians with high social credit scores so it pulled one out of the hopper that hit your keywords. They’re gonna hit you with his liberal rant later when it is important…

AI is too stupid to take over but it is not your friend…

RigelDog said...

Yes, it crosses a line---many lines, actually.

NB: It figures that the parents who consented to having their child be part of a social experiment have named their daughter Luna.

Yancey Ward said...

Yes, it is unethical. It will be a nuisance for the little girl that provides her no benefit and, even worse, there doesn't appear to be any information that couldn't have been acquired with a camera not attached to the child.

Iman said...

Anything that Jim Gaffigan may do that will land his fat ass in a prison cell should not only be permitted but encouraged, as well.

John henry said...

What do you do if you find the baby is watching porn when you are not around?

Or worse, trump rallies

John Henry

Iman said...

“The real issue is by mounting a camera on the baby's head ultimately you're going to be changing the baby's behavior. That's what's wrong with it.”

Quick, mount a camera to Howard’s cashew-shaped melon! Think of the possibilities… some of them positive.

John henry said...

I don't see an ethical issue with the camera. Kind of a cool idea actually.

The problem would be what is done with the data.

I do see, potentially, a physical problem as someone else mentioned. What about weight and restrictions on movement on a weak and fragile young body?

John Henry

tim maguire said...

Ann Althouse said...
"I don't see what the issue is. If the camera is on the baby's head, then the baby is not on camera. You're not recording the baby, you're trying to get a sense of what the baby sees."

1. You are recording what the baby points its head toward, that is, the choices the baby makes.

2. You are recording the baby's experience — what it sees and hears.

3. You are recording the parts of its own body the baby looks at.

4. You are recording all the sounds the baby makes — efforts at speech, crying, moaning, farting, shitting.

What the baby's face looks like is an important aspect of the baby's interest in privacy, but why are you prioritizing it to such a degree that you have lost awareness that the baby is being recorded?


Because the baby isn't being recorded. No.'s 1&2 are so obviously non-issues, I'm surprised you included them at all (and you included them first!). No.'s 3&4 are anonymous unless the baby is looking at their very distinctive birthmark. Again, a non-issue.

This is far less invasive than recordings that have been regularly made by parents for generations. Do you have pictures of your child as a baby? Do you have video of your child as a baby? That is much worse than what we are talking about here.

Leland said...

I would think we have millennia of experience training children to learn languages, even ones as complex as English. One would think software programmers would also have experience learning languages. Why exactly do we need to develop a science of observing what a child observes to train AI to do something we should already know how to train AI to do?

Jimmy said...

Brave new world indeed. Can we experiment by invading an individual's personal space? Next up, let's implant a chip and follow every second of a person's existence.
Bad enough we used beagles to experiment on. Oh wait, the government experimented on black men, white men, with drugs, diseases, and sanity. That was fun.
Have we figured out that the government wants to do anything it feels like, for our own good?
Now parents, probably the same ones who decide that little Billy at 5 is really little Jane, can use their kids as lab rats.
Didn't one of the founders of America say that all of this Republic stuff rests on an informed, moral, ethical citizen? Oh boy, are we in trouble.

Jupiter said...

"Grok only considers the social ethics, not the intrinsic morality of the body."

Just stop right there. Grok does not "consider" anything. Grok is a computer program, which is designed to construct an ordered set of words that resembles, in certain fairly arbitrary ways, numerous other ordered wordsets it has encountered in the past.

Howard said...

It's a metaphor wrapped in an iguana after drinking Ripple.

The act of taking a measurement disrupts what you are measuring. Like how trial outcomes are affected by cameras in the courtroom.

Heisenberg's Uncertainty Principle states that there is inherent uncertainty in the act of measuring a variable of a particle. Commonly applied to the position and momentum of a particle, the principle states that the more precisely the position is known the more uncertain the momentum is and vice versa.

Heisenberg imagines an experimenter trying to measure the position and momentum of an electron by shooting a photon at it.

Problem 1 – If the photon has a short wavelength, and therefore, a large momentum, the position can be measured accurately. But the photon scatters in a random direction, transferring a large and uncertain amount of momentum to the electron. If the photon has a long wavelength and low momentum, the collision does not disturb the electron's momentum very much, but the scattering will reveal its position only vaguely.
Problem 2 – If a large aperture is used for the microscope, the electron's location can be well resolved (see Rayleigh criterion); but by the principle of conservation of momentum, the transverse momentum of the incoming photon affects the electron's beamline momentum and hence, the new momentum of the electron resolves poorly. If a small aperture is used, the accuracy of both resolutions is the other way around.

The combination of these trade-offs implies that no matter what photon wavelength and aperture size are used, the product of the uncertainty in measured position and measured momentum is greater than or equal to a lower limit, which is (up to a small numerical factor) equal to the Planck constant. Heisenberg did not care to formulate the uncertainty principle as an exact limit, and preferred to use it instead as a heuristic quantitative statement, correct up to small numerical factors, which makes the radically new noncommutativity of quantum mechanics inevitable.
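In symbols, the textbook (Kennard) form of the lower bound that passage describes is

$$ \sigma_x \, \sigma_p \;\ge\; \frac{\hbar}{2} $$

where $\sigma_x$ and $\sigma_p$ are the standard deviations of position and momentum and $\hbar$ is the reduced Planck constant.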

Jupiter said...

And Althouse, you've got some fucking nerve, worrying about the "ethics" of putting a camera on your baby's head. Is it ethical to pay some ghoul in a white robe to seize your baby's legs with serrated stainless steel forceps and rip them out of her tiny body? Of course it is! Why wouldn't it be? A woman, whatever that is, chose to do it. Women, whatever those might be, never choose to do anything unethical. I think it's in the Constitution. If it isn't, it should be. Anything else is a false narrative.

tommyesq said...

Anything that Jim Gaffigan may do that will land his fat ass in a prison cell should not only be permitted but encouraged, as well.

Who could possibly dislike Jim Gaffigan??

Jamie said...

Howard at 8:14, exactly! Heisenberg! I was going to make precisely that observation.

Yancey Ward said...

Howard,

The Heisenberg Principle really doesn't have anything at all to do with observer effects in studying a human being in a psychological experiment. They are different principles that only superficially resemble each other. To make the point: it is entirely possible to remove the observer effect in psychology experiments by removing the subject's knowledge that there is an observer. The Heisenberg Principle can't be evaded in a similar manner.

Joe Smith said...

Gaffigan is an unhinged lunatic lefty.

He is in Howard Stern/Kathy Griffin territory, so what he has to say about things concerns me not...

rehajm said...

Who could possibly dislike Jim Gaffigan??

RAISES HAND

mikee said...

A data point from my own lived experience: Don't cut the hair of your child for the first time, or heck, the first ten times, without your wife present for the event. Don't believe her when she grants explicit approval for the activity. It isn't enough. There is a basis for the myth of Samson, and you can experience it personally if you don't have her there for the shearing of the locks.

mikee said...

For the first year, a kiddy cam would be observing the parents and others interacting with the kiddy more than anything the baby does. Once mobility is achieved, then these parents and others not only have interactions with the kiddy, they sometimes have to chase the little ones down and wrangle them back to where they belong. That's when the videos get interesting.

Yinzer said...

Anyone who would subject any child to this is a self-absorbed idiot. Probably the same type that would decide that their baby wants to be trans.

Jupiter said...

"AI wants you to look at comedians with high social credit scores so it pulled one out of the hopper that hit your keywords."

Hmmmm, that is an aspect of the situation I had not considered. AI, as an approach to programming, involves scanning vast numbers of "target" texts and essentially extracting rules for combining words. However, there is nothing to prevent the owners of the AI platform from introducing other criteria as well. Indeed, we already saw them adjusting the program settings so we got black Vikings. It is probably a mistake to assume that Grok does what Grok's owners say Grok does.

Bunkypotatohead said...

Google pays their Maps drivers around $35/hr. If the kid starts early, it might be able to fund its college education.
Surely Google would love to collect a lifetime's worth of information on everyone.