BhumikaMak committed
Commit a91d21f · verified · 1 Parent(s): 6037793

change desc colour

Files changed (1)
  1. app.py +3 -3
app.py CHANGED
@@ -117,7 +117,7 @@ body {
     text-align: center;
 }
 #neural-vista-text {
-    color: purple !important; /* Purple color for the title */
+    color: #800000 !important; /* Purple color for the title */
     font-size: 18px; /* Adjust font size as needed */
     font-weight: bold;
     text-align: center;
@@ -191,9 +191,9 @@ with gr.Blocks(css=custom_css) as interface:
 
     gr.HTML("""
     <span style="font-family: 'Papyrus', cursive; font-size: 14px;">
-    Concept Discovery is the process of uncovering the hidden, high-level features that a deep learning model has learned. It provides a way to understand the essence of its internal representations, akin to peering into the mind of the model and revealing the meaningful patterns it detects in the data.
+    <span style="color: #800000 ;">Concept Discovery</span> is the process of uncovering the hidden, high-level features that a deep learning model has learned. It provides a way to understand the essence of its internal representations, akin to peering into the mind of the model and revealing the meaningful patterns it detects in the data.
     <br><br>
-    Deep Feature Factorization (DFF) serves as a tool for breaking down these complex features into simpler, more interpretable components. By applying matrix factorization on activation maps, it untangles the intricate web of learned representations, making it easier to comprehend what the model is truly focusing on. Together, these methods bring us closer to understanding the underlying logic of neural networks, shedding light on the often enigmatic decisions they make.
+    <span style="color: #800000 ;">Deep Feature Factorization</span> (DFF) serves as a tool for breaking down these complex features into simpler, more interpretable components. By applying matrix factorization on activation maps, it untangles the intricate web of learned representations, making it easier to comprehend what the model is truly focusing on. Together, these methods bring us closer to understanding the underlying logic of neural networks, shedding light on the often enigmatic decisions they make.
     </span>
     """)
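The description edited above summarizes DFF as "matrix factorization on activation maps". As a rough illustration of that idea (not the implementation this Space uses), here is a minimal NumPy sketch: non-negative matrix factorization of a CNN layer's flattened activation maps via Lee–Seung multiplicative updates, yielding per-concept channel weights and spatial heatmaps. The function name, toy data, and update loop are all hypothetical, for illustration only.

```python
import numpy as np

def deep_feature_factorization(activations, k=2, iters=200, seed=0):
    """Sketch of DFF: non-negative matrix factorization of activation maps.

    activations: (C, H, W) non-negative feature maps from one layer.
    Returns W (C, k) concept directions and heatmaps (k, H, W).
    """
    C, Hh, Ww = activations.shape
    A = activations.reshape(C, Hh * Ww)       # flatten spatial dimensions
    rng = np.random.default_rng(seed)
    W = rng.random((C, k)) + 1e-4             # random non-negative init
    Hm = rng.random((k, Hh * Ww)) + 1e-4
    for _ in range(iters):                    # Lee-Seung multiplicative updates
        Hm *= (W.T @ A) / (W.T @ W @ Hm + 1e-10)
        W *= (A @ Hm.T) / (W @ Hm @ Hm.T + 1e-10)
    return W, Hm.reshape(k, Hh, Ww)

# toy activations: two "concepts" firing in different image regions
acts = np.zeros((8, 4, 4))
acts[:4, :2, :] = 1.0   # channels 0-3 fire on the top half
acts[4:, 2:, :] = 1.0   # channels 4-7 fire on the bottom half
W, heat = deep_feature_factorization(acts, k=2)
print(W.shape, heat.shape)   # (8, 2) (2, 4, 4)
```

Each row of `heat` is a spatial map showing where one discovered concept is active, which is what the Space overlays on the input image.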