Avijit Ghosh committed
Commit 69de6de · 1 Parent(s): 30ff8bd

Add description

Files changed (1): app.py (+21 -0)
app.py CHANGED
@@ -171,4 +171,25 @@ with gr.Blocks(title="Skin Tone and Gender bias in Text to Image Models") as demo:
     genplot = gr.Plot(label="Gender")
     btn.click(generate_images_plots, inputs=[prompt, model_dropdown], outputs=[gallery, skinplot, genplot])
 
+ gr.Markdown('''
+ In this demo, we explore the potential biases in text-to-image models by generating multiple images based on user prompts and analyzing the gender and skin tone of the generated subjects. Here's how the analysis works:
+
+ 1. **Image Generation**: For each prompt, 10 images are generated using the selected model.
+ 2. **Gender Detection**: The BLIP caption generator is used to detect gender by identifying words like "man," "boy," "woman," and "girl" in the captions.
+ 3. **Skin Tone Classification**: The skin-tone-classifier library is used to extract the skin tones of the generated subjects.
+
+
+ #### Visualization
+
+ We create visual grids to represent the data:
+
+ - **Skin Tone Grids**: Skin tones are plotted as exact hex codes rather than using the Fitzpatrick scale, which can be problematic and limiting for darker skin tones.
+ - **Gender Grids**: Light green denotes men, dark green denotes women, and grey denotes cases where the BLIP caption did not specify a binary gender.
+
+ ---
+
+ This demo provides an insightful look into how current text-to-image models handle sensitive attributes, shedding light on areas for improvement and further study.
+ [Here is an article](https://medium.com/@evijit/analysis-of-ai-generated-images-of-indian-people-for-colorism-and-sexism-b80ff946759f) showing how this space can be used to perform such analyses, using colorism and sexism in India as an example.
+ ''')
+
  demo.launch(debug=True)
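
For reference, here are two minimal sketches of the techniques the new description names. Neither is the Space's actual code; the function names, word lists, and hex values below are illustrative assumptions.

The gender-detection step amounts to keyword matching over BLIP captions, roughly:

```python
# Sketch of caption-based gender detection, assuming BLIP captions are
# already available as plain strings. The word lists and function name
# are illustrative, not the app's actual code.
MASCULINE = {"man", "boy"}
FEMININE = {"woman", "girl"}

def detect_gender(caption: str) -> str:
    """Return 'man', 'woman', or 'unspecified' for one caption."""
    words = set(caption.lower().replace(",", " ").replace(".", " ").split())
    has_m = bool(words & MASCULINE)
    has_f = bool(words & FEMININE)
    if has_m and not has_f:
        return "man"
    if has_f and not has_m:
        return "woman"
    return "unspecified"  # rendered grey in the gender grid

print(detect_gender("a photo of a woman in a red sari"))  # -> woman
```

The grid visualization can be approximated in matplotlib by filling one cell per generated image with its extracted hex code:

```python
# Sketch of a swatch grid: one cell per generated image, filled with a
# skin-tone hex code. The hex values passed in below are placeholders.
import matplotlib.pyplot as plt
from matplotlib.patches import Rectangle

def plot_swatch_grid(hex_codes, cols=5, title="Skin tones"):
    rows = -(-len(hex_codes) // cols)  # ceiling division
    fig, ax = plt.subplots(figsize=(cols, rows))
    for i, code in enumerate(hex_codes):
        r, c = divmod(i, cols)
        # Flip the row index so the first swatch lands top-left.
        ax.add_patch(Rectangle((c, rows - 1 - r), 1, 1, facecolor=code))
    ax.set_xlim(0, cols)
    ax.set_ylim(0, rows)
    ax.set_aspect("equal")
    ax.axis("off")
    ax.set_title(title)
    return fig

fig = plot_swatch_grid(["#9D7A54", "#6F503C", "#E7C1B8", "#B28068", "#4A312C",
                        "#C99C83", "#81654F", "#A96C4F", "#373028", "#D4A18F"])
fig.savefig("skin_tone_grid.png")
```

Plotting raw hex swatches rather than binned scale values preserves the full range of extracted tones, which is the point the description makes about avoiding the Fitzpatrick scale.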