update: button colour, text background
app.py
CHANGED
@@ -100,7 +100,7 @@ body {
     box-sizing: border-box; /* Ensure consistent sizing */
 }
 #run-button {
-    background-color: #
+    background-color: #800000 !important;
     color: white !important;
     font-size: 12px !important; /* Small font size */
     width: 100px !important; /* Fixed width */
@@ -152,6 +152,7 @@ body {
     font-size: 18px; /* Adjust font size as needed */
     font-weight: bold;
     text-align: center;
+
 }
 
 """
@@ -161,10 +162,10 @@ body {
 with gr.Blocks(css=custom_css) as interface:
 
     gr.HTML("""
-        <span style="color: #800000; font-family: 'Papyrus', cursive; font-weight: bold; font-size: 32px;">NeuralVista</span><br><br>
+        <span style="color: #800000; font-family: 'Papyrus', cursive; font-weight: bold; font-size: 32px; background-color: #d3d3d3;">NeuralVista</span><br><br>
 
 
-        <span style="color: black; font-family: 'Papyrus', cursive; font-size: 18px;">A harmonious framework of tools ☼ designed to illuminate the inner workings of AI.</span>
+        <span style="color: black; font-family: 'Papyrus', cursive; font-size: 18px; background-color: #d3d3d3;">A harmonious framework of tools ☼ designed to illuminate the inner workings of AI.</span>
     """)
 
     # Default sample
@@ -200,7 +201,7 @@ with gr.Blocks(css=custom_css) as interface:
     )
 
 
-    gr.HTML("""<span style="font-family: 'Papyrus', cursive; font-size: 14px;">The visualization demonstrates object detection and interpretability. Detected objects are highlighted with bounding boxes, while the heatmap reveals regions of focus, offering insights into the model's decision-making process.</span>""")
+    gr.HTML("""<span style="font-family: 'Papyrus', cursive; font-size: 14px; background-color: #d3d3d3;">The visualization demonstrates object detection and interpretability. Detected objects are highlighted with bounding boxes, while the heatmap reveals regions of focus, offering insights into the model's decision-making process.</span>""")
     # Results and visualization
     with gr.Row(elem_classes="custom-row"):
         result_gallery = gr.Gallery(
@@ -222,9 +223,9 @@ with gr.Blocks(css=custom_css) as interface:
 
     gr.HTML("""
     <span style="font-family: 'Papyrus', cursive; font-size: 14px;">
-    <span style="color: #800000 ;">Concept Discovery</span> is the process of uncovering the hidden, high-level features that a deep learning model has learned. It provides a way to understand the essence of its internal representations, akin to peering into the mind of the model and revealing the meaningful patterns it detects in the data.
+    <span style="color: #800000; background-color: #d3d3d3;">Concept Discovery</span> is the process of uncovering the hidden, high-level features that a deep learning model has learned. It provides a way to understand the essence of its internal representations, akin to peering into the mind of the model and revealing the meaningful patterns it detects in the data.
     <br><br>
-    <span style="color: #800000
+    <span style="color: #800000; background-color: #d3d3d3;">Deep Feature Factorization</span> (DFF) serves as a tool for breaking down these complex features into simpler, more interpretable components. By applying matrix factorization on activation maps, it untangles the intricate web of learned representations, making it easier to comprehend what the model is truly focusing on. Together, these methods bring us closer to understanding the underlying logic of neural networks, shedding light on the often enigmatic decisions they make.
     </span>
     """)
 
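The Deep Feature Factorization text added in this commit describes applying matrix factorization to activation maps. As a rough illustration of that idea (not the app's actual implementation — the function name and the toy data below are hypothetical), non-negative matrix factorization via multiplicative updates can decompose a channels-by-spatial activation matrix into a few "concept" heatmaps:

```python
import numpy as np

def deep_feature_factorization(activations, k, n_iter=200, seed=0):
    """Factorize a (channels, height*width) activation matrix A ≈ W @ H
    with non-negative W (channels x k) and H (k x spatial) using
    multiplicative NMF updates. Each row of H, reshaped to the spatial
    grid, acts as one concept heatmap."""
    rng = np.random.default_rng(seed)
    A = np.maximum(activations, 0)  # NMF needs non-negative input (e.g. post-ReLU)
    c, s = A.shape
    W = rng.random((c, k)) + 1e-4
    H = rng.random((k, s)) + 1e-4
    for _ in range(n_iter):
        # Multiplicative updates keep W and H non-negative throughout.
        H *= (W.T @ A) / (W.T @ W @ H + 1e-10)
        W *= (A @ H.T) / (W @ H @ H.T + 1e-10)
    return W, H

# Toy example: 64 channels over an 8x8 spatial grid.
acts = np.abs(np.random.default_rng(1).normal(size=(64, 64)))
W, H = deep_feature_factorization(acts, k=4)
heatmaps = H.reshape(4, 8, 8)  # one spatial "concept" heatmap per factor
```

In a real pipeline the input would be a convolutional layer's activations flattened over the spatial dimensions, and each heatmap would be upsampled back onto the image.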