29 Apr

the anxiety of the algorithm age

People are here.

This blog begins with, and is shaped by, a moment of anxiety about what our world can become. What I know, living in New York City as a Black Haitian American male, in an America still reeling from the economic turmoil of the Aughts and the start of a second Gilded Age, is that we have a problem with who gets to be part of the effort writing the future. There has been some expansion in the kinds of people who can enjoy this privilege, but for the most part, people of diverse genders, races, sexual orientations, and classes have faced immense barriers to gaining a prominent standing in our culture. As the demographics of our country change, so does the need to turn up the volume on these topics.

This conversation is about diversity. This conversation is about bias. This conversation is about technology. This conversation is about how when you do something as simple as grab your phone, and reach out to someone, there are hidden costs in the way our diversity shapes that experience.

This conversation is about the anxiety of algorithms. Machines and the algorithms we give them are becoming better, faster, more useful, and ubiquitous: from choosing customers to predicting a child’s best-fit occupation. What is hard to uncover is how personal experience and bias play a part in algorithm creation. Soon, machines will be able to come up with their own algorithms, injecting our biases with the efficiency of silicon and megaflops.

I hope that I can help shepherd this conversation toward understanding privilege: the tendency for all of us to believe our personal experience is the norm. I hope we can become sensitive to how that tendency plays into how we build and evaluate our tools.

This is not a moral panic. This is about understanding the hidden costs in our technology and designing a future for more than just yourself.

07 Dec

“Child welfare technology proved unreliable, DCFS chief tells Tribune”

“The Illinois Department of Children and Family Services is ending a high-profile program that used computer data mining to identify children at risk for serious injury or death after the agency’s top official called the technology unreliable.”

Read: Child welfare technology proved unreliable, DCFS chief tells Tribune
