BhumikaMak committed
Commit 38883b2 · verified · 1 Parent(s): 232b1e9

update: bold text

Files changed (1)
  1. app.py +7 -6
app.py CHANGED
@@ -200,9 +200,10 @@ with gr.Blocks(css=custom_css) as interface:
             value=load_sample_image(default_sample),
             label="Selected Sample Image",
         )
-
-
-    gr.HTML("""<span style="font-family: 'Papyrus', cursive; font-size: 14px;">The visualization demonstrates object detection and interpretability. Detected objects are highlighted with bounding boxes, while the heatmap reveals regions of focus, offering insights into the model's decision-making process.</span>""")
+
+    gr.HTML("""
+    <span style="font-family: 'Papyrus', cursive; font-size: 14px; font-weight: bold;">The visualization demonstrates object detection and interpretability. Detected objects are highlighted with bounding boxes, while the heatmap reveals regions of focus, offering insights into the model's decision-making process.</span>
+    """)
     # Results and visualization
     with gr.Row(elem_classes="custom-row"):
         result_gallery = gr.Gallery(
@@ -222,11 +223,11 @@ with gr.Blocks(css=custom_css) as interface:
         )


-    gr.HTML("""
+    gr.HTML("""
     <span style="font-family: 'Papyrus', cursive; font-size: 14px;">
-    <span style="color: #800000;">Concept Discovery</span> is the process of uncovering the hidden, high-level features that a deep learning model has learned. It provides a way to understand the essence of its internal representations, akin to peering into the mind of the model and revealing the meaningful patterns it detects in the data.
+    <span style="color: #800000; font-weight: bold;">Concept Discovery</span> is the process of uncovering the hidden, high-level features that a deep learning model has learned. It provides a way to understand the essence of its internal representations, akin to peering into the mind of the model and revealing the meaningful patterns it detects in the data.
     <br><br>
-    <span style="color: #800000;">Deep Feature Factorization</span> (DFF) serves as a tool for breaking down these complex features into simpler, more interpretable components. By applying matrix factorization on activation maps, it untangles the intricate web of learned representations, making it easier to comprehend what the model is truly focusing on. Together, these methods bring us closer to understanding the underlying logic of neural networks, shedding light on the often enigmatic decisions they make.
+    <span style="color: #800000; font-weight: bold;">Deep Feature Factorization</span> (DFF) serves as a tool for breaking down these complex features into simpler, more interpretable components. By applying matrix factorization on activation maps, it untangles the intricate web of learned representations, making it easier to comprehend what the model is truly focusing on. Together, these methods bring us closer to understanding the underlying logic of neural networks, shedding light on the often enigmatic decisions they make.
     </span>
     """)
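The description of Deep Feature Factorization in the patched text (matrix factorization applied to activation maps) can be sketched in a few lines. This is a minimal illustration only, not code from this repo: it assumes sklearn's `NMF` and uses synthetic non-negative activations in place of real CNN feature maps, and the component count `k` is an arbitrary choice.

```python
# Minimal DFF-style sketch: non-negative matrix factorization (NMF)
# over a CNN-like activation tensor. Shapes and data are illustrative
# assumptions, not taken from app.py.
import numpy as np
from sklearn.decomposition import NMF

# Stand-in activations for one image: C channels over an H x W feature map
# (in practice, collected via a forward hook on a conv layer).
rng = np.random.default_rng(0)
C, H, W = 64, 7, 7
activations = np.abs(rng.normal(size=(C, H, W)))  # ReLU outputs are non-negative

# Flatten spatial dims: each column is the C-dim feature at one position.
A = activations.reshape(C, H * W)

# Factorize A ≈ W_f @ H_f with k "concepts".
k = 4
nmf = NMF(n_components=k, init="nndsvda", max_iter=500, random_state=0)
W_f = nmf.fit_transform(A)   # (C, k): per-concept channel weights
H_f = nmf.components_        # (k, H*W): per-concept spatial factor

# Each spatial factor reshapes into a heatmap like the ones overlaid in the app.
heatmaps = H_f.reshape(k, H, W)
print(heatmaps.shape)
```

Each row of `heatmaps` is a non-negative spatial map highlighting where one discovered concept fires, which is the kind of overlay the app's visualization text describes.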