29 Apr

the anxiety of the algorithm age

People are here.

This blog begins with, and is shaped by, a moment of anxiety about what our world could become. What I know, living in New York City as a Black Haitian American male, in an America still reeling from the economic turmoil of the Aughts and the start of a second Gilded Age, is that we have a problem with who gets to be part of the effort of writing the future. There has been some expansion in the kinds of people who can enjoy this privilege, but for the most part, diversity in the form of gender, race, sexual orientation, and class has faced immense barriers to prominent standing in our culture. As the demographics of our country change, so does the need to turn up the volume on these topics.

This conversation is about diversity. This conversation is about bias. This conversation is about technology. This conversation is about how when you do something as simple as grab your phone, and reach out to someone, there are hidden costs in the way our diversity shapes that experience.

This conversation is about the anxiety of algorithms. Machines and the algorithms we give them are becoming better, faster, more useful, and more pervasive: from choosing customers to predicting a child’s best-fit occupation. What is hard to uncover is how personal experience and bias play a part in algorithm creation. Soon, machines will be able to come up with their own algorithms, injecting our biases with the efficiency of silicon and megaflops.

I hope that I can help shepherd this conversation toward understanding privilege: the tendency for all of us to believe our personal experience is the norm. I hope we can become sensitive to how that tendency plays into how we build and evaluate our tools.

This is not a moral panic. This is about understanding the hidden costs in our technology and designing a future for more than just yourself.

22 May

“Amazon is selling police departments a real-time facial recognition system”

“Documents obtained by the ACLU of Northern California have shed new light on Rekognition, Amazon’s little-known facial recognition project. Rekognition is currently used by police in Orlando and Oregon’s Washington County, often using nondisclosure agreements to avoid public disclosure.”

Read: Amazon is selling police departments a real-time facial recognition system

13 May

“YouTube’s search suggests racist autocompletes”

“Type the perfectly innocuous phrase “black men are” into YouTube’s search box, and the site will automatically suggest a number of racist results, such as “black men are weak minded,” “black men are too emotional,” and “black men are coons,” among many others.”

Read: YouTube’s search suggests racist autocompletes

26 Apr

“Axon launches AI ethics board to study the dangers of facial recognition”

“Axon, formerly known as Taser, has launched a new “AI ethics board” to guide its use of artificial intelligence. The board will meet twice a year to discuss the ethical implications of upcoming Axon products, particularly how their use might affect community policing.”

Read: Axon launches AI ethics board to study the dangers of facial recognition

24 Apr

“10 years in, the Marvel Cinematic Universe still lacks diversity — and these 4 graphs prove it”

“Marvel’s Avengers: Infinity War, out in theaters April 27, promises to be a culmination of every origin story, direct sequel and mash-up that the Marvel Cinematic Universe has produced over its 10-year history, starting with 2008’s Iron Man and running through February’s Black Panther.”

Read: 10 years in, the Marvel Cinematic Universe still lacks diversity — and these 4 graphs prove it
