[00:00:02]

This is TED Talks Daily. When machines learn, what exactly are they learning from us? Anthropologist Veronica Barassi brings up some rather terrifying examples of AI and how companies start collecting data on our kids before they're even born. In her eye-opening talk from TEDxMileHigh in 2019, she asks us how much we should really trust artificial intelligence for ourselves and our children, and she sheds some light on what we need to do now to stand up for our personal data.

[00:00:38]

Every day, every week, we agree to terms and conditions, and when we do this, we provide companies with the lawful right to do whatever they want with our data and with the data of our children.

[00:00:54]

Which makes us wonder: how much data are we giving away about children, and what are its implications?

[00:01:04]

I'm an anthropologist, and I'm also the mother of two little girls, and I started to become interested in these questions in 2015 when I suddenly realized that there were vast, almost unimaginable amounts of data traces that are being produced and collected about children.

[00:01:22]

So I launched a research project, which is called Child Data Citizen, and that aimed at filling in the blank. Now, you may think that I'm here to blame you for posting photos of your children on social media, but that's not really the point. The problem is way bigger than so-called "sharenting." This is about systems, not individuals; you and your habits are not to blame. For the very first time in history, we are tracking the individual data of children

[00:01:55]

from long before they're born, sometimes from the moment of conception and then throughout their lives. You see, when parents decide to conceive, they go online to look for ways to get pregnant, or they download ovulation-tracking apps.

[00:02:13]

When they do get pregnant, they post the ultrasounds of their babies on social media, they download pregnancy apps, or they consult Dr. Google for all sorts of things, like, you know, miscarriage risk when flying or abdominal cramps in early pregnancy. I know because I've done it, and many times. And then, when the baby's born, they track every nap, every feed, every life event on different technologies. And all of these technologies transform the baby's most intimate behavioral and health data into profit by sharing it with others.

[00:02:54]

So to give you an idea of how this works, in 2019, the British Medical Journal published research that showed that out of 24 mobile health apps, 19 shared information with third parties.

[00:03:09]

And these third parties shared information with 216 other organizations. Of these 216 other fourth parties, only three belonged to the health sector. The other companies that had access to that data were big tech companies like Google, Facebook or Oracle; they were digital advertising companies; and there was also a consumer credit reporting agency.

[00:03:39]

So you get it right: ad companies and credit agencies may already have data points on little babies. But mobile apps, web searches and social media are really just the tip of the iceberg, because children are being tracked by multiple technologies in their everyday lives. They're tracked by home technologies and virtual assistants in their homes.

[00:04:01]

They're tracked by educational platforms and educational technologies in their schools.

[00:04:05]

They're tracked by online records and online portals at their doctor's office. They're tracked by their internet-connected toys, their online games and many, many, many, many other technologies.

[00:04:18]

So during my research, a lot of parents came up to me and they were like, so what? Why does it matter if my children are being tracked?

[00:04:27]

We've got nothing to hide. Well, it matters. It matters because today individuals are not only being tracked; they are also being profiled on the basis of their data traces.

[00:04:43]

Artificial intelligence and predictive analytics are being used to harness as much data as possible about an individual life from different sources: family history, purchasing habits, social media comments. And then they bring this data together to make data-driven decisions about the individual.

[00:05:02]

And these technologies are used everywhere. Banks use them to decide loans, insurers use them to decide premiums, and recruiters and employers use them to decide whether one is a good fit for a job or not.

[00:05:18]

Also, the police and courts use them to determine whether one is a potential criminal or is likely to commit a crime.

[00:05:30]

We have no knowledge or control over the ways in which those who buy, sell and process our data are profiling us and our children, but these profiles can come to impact our rights in significant ways.

[00:05:46]

To give you an example.

[00:05:51]

In 2018, The New York Times published the news that the data that had been gathered through online college planning services, which are actually completed by millions of high school kids across the U.S. who are looking for a college program or scholarship, had been sold to educational data brokers.

[00:06:13]

Now, educational researchers at Fordham who studied educational data brokers revealed that these companies profiled kids as young as two on the basis of different categories:

[00:06:27]

ethnicity, religion, affluence, social awkwardness and many other random categories. And then they sell these profiles, together with the name of the kid, the home address and the contact details, to different companies, including trade and career institutions,

[00:06:50]

student loan and student credit card companies.

[00:06:54]

To push the boundaries, the researchers at Fordham asked an educational data broker to provide them with a list of 14-to-15-year-old girls who were interested in family planning services.

[00:07:10]

The data broker agreed to provide them the list, so imagine how intimate and how intrusive that is for our kids.

[00:07:18]

But educational data brokers are really just an example. The truth is that our children are being profiled in ways that we cannot control, but that can significantly impact their chances in life.

[00:07:32]

So we need to ask ourselves, can we trust these technologies when it comes to profiling our children? Can we? My answer is no. As an anthropologist, I believe that artificial intelligence and predictive analytics can be great to predict the course of a disease or to fight climate change.

[00:07:55]

But we need to abandon the belief that these technologies can objectively profile humans and that we can rely on them to make data driven decisions about individual lives because they can't profile humans.

[00:08:09]

Data traces are not the mirror of who we are. Humans think one thing and say the opposite, feel one way and act differently.

[00:08:16]

Algorithmic predictions or our digital practices cannot account for the unpredictability and complexity of human experience.

[00:08:26]

But on top of that, these technologies are always, always, in one way or another, biased. You see, algorithms are, by definition, sets of rules or steps that have been designed to achieve a specific result.

[00:08:43]

OK? But these sets of rules or steps cannot be objective, because they've been designed by human beings within a specific cultural context and are shaped by specific cultural values.

[00:08:54]

So when machines learn, they learn from biased algorithms, and they often learn from biased databases as well. At the moment, we're seeing the first examples of algorithmic bias, and some of these examples are frankly terrifying. This year, the AI Now Institute in New York published a report that revealed that the AI technologies being used for predictive policing have been trained on "dirty data."

[00:09:26]

This is basically data that had been gathered during historical periods of known racial bias and non-transparent police practices.

[00:09:36]

Because these technologies are being trained with dirty data, they're not objective, and their outcomes are only amplifying and perpetuating police bias and error.

[00:09:51]

So I think we are faced with a fundamental problem in our society: we are starting to trust technologies when it comes to profiling human beings.

[00:10:01]

We know that in profiling humans, these technologies are always going to be biased and are never really going to be accurate. So what we need now is actually a political solution. We need governments to recognize that our data rights are human rights.

[00:10:25]

Until this happens, we cannot hope for a more just future. I worry that my daughters are going to be exposed to all sorts of algorithmic discrimination and error.

[00:10:37]

You see, the difference between me and my daughters is that there's no public record out there of my childhood, and certainly no database of all the stupid things that I've done and thought when I was a teenager.

[00:10:50]

But for my daughters, this may be different. The data that is being collected from them today may be used to judge them in the future and can come to prevent their hopes and dreams. I think that it's time. It's time that we all step up, it's time that we start working together as individuals, as organizations and as institutions, and that we demand greater data justice for us and for our children before it's too late. Thank you.

[00:11:26]

Hi, I'm Saleem Reshamwala, host of a new podcast from TED called Pindrop. Every week, you'll travel to a different location around the world, get lost in a new vibe and tap into a surprising idea. Next up: Nairobi, Kenya, where you'll hear about a new movement telling stories of joy and frivolity from Africa. That's Pindrop from TED. Check out Pindrop on Apple Podcasts, Spotify or wherever you listen.

[00:11:57]

PRX.