29 Apr

the anxiety of the algorithm age

People are here.

This blog begins with, and is shaped by, a moment of anxiety about what our world may become. What I know, living in New York City as a Black Haitian American male, in an America still reeling from the economic turmoil of the Aughts and the start of a second Gilded Age, is that we have a problem with who gets to take part in the effort of writing the future. There has been some expansion in the kinds of people who can enjoy this privilege, but for the most part, people diverse in gender, race, sexual orientation, and class have faced immense barriers to gaining a prominent standing in our culture. As the demographics of our country change, so does the need to turn up the volume on these topics.

This conversation is about diversity. This conversation is about bias. This conversation is about technology. This conversation is about how when you do something as simple as grab your phone, and reach out to someone, there are hidden costs in the way our diversity shapes that experience.

This conversation is about the anxiety of algorithms. Machines, and the algorithms we give them, are becoming better, faster, more useful, and ubiquitous: from choosing customers to predicting a child’s best-fit occupation. What is harder to uncover is how personal experience and bias play a part in algorithm creation. Soon, machines will be able to come up with their own algorithms, injecting our biases with the efficiency of silicon and megaflops.

I hope that I can help shepherd this conversation toward understanding privilege: the tendency for all of us to believe our personal experience is the norm. I hope we can become sensitive to how that tendency plays into how we build and evaluate our tools.

This is not a moral panic. This is about understanding the hidden costs in our technology and designing a future not just for yourself.

22 Sep

“Facebook’s racist ad problems were baked in from the start”

“Facebook and Google, two of world’s biggest and most influential companies, pride themselves on their ad businesses. These operations generate tens of billions of dollars per year, thanks in part to letting advertisers target even the most obscure microcommunities using unprecedented sets of data.”

Read: Facebook’s racist ad problems were baked in from the start

22 Sep

“The invention of AI ‘gaydar’ could be the start of something much worse”

“Two weeks ago, a pair of researchers from Stanford University made a startling claim. Using hundreds of thousands of images taken from a dating website, they said they had trained a facial recognition system that could identify whether someone was straight or gay just by looking at them.”

Read: The invention of AI ‘gaydar’ could be the start of something much worse
