rmm committed · Commit 16fcc83 · Parent(s): eaa1c71

docs: added some notes on how to run tests

docs/dev_notes.md CHANGED (+46 -4)
We set this up so it is hosted as a huggingface space. Each commit to `main` triggers …

For local testing, assuming you have all the required packages installed in a
conda env or virtualenv, and that env is activated:

```bash
cd src
streamlit run main.py
```

We have a CI action to present the docs on github.io.

To validate locally, you need the deps listed in `requirements.txt` installed.

Run

```bash
mkdocs serve
```

And navigate to the server running locally, by default: http://127.0.0.1:8888/

This automatically watches for changes in the markdown files, but if you edit
something else, like the docstrings in py files, triggering a rebuild in another terminal
refreshes the site without having to quit and restart the server:

```bash
mkdocs build -c
```

# Set up a conda env

(Standard stuff)
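For completeness, a minimal sketch of that standard setup; the env name and Python version here are placeholders, not anything pinned by the project:

```bash
# create and activate a fresh env (env name and Python version are assumptions)
conda create -n dev-env python=3.10
conda activate dev-env

# install the project dependencies (requirements.txt is the file referenced above)
pip install -r requirements.txt
```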
# Testing

## Local testing

To run the tests locally, we need the standard dependencies of the project plus the test runner dependencies:

```bash
pip install -r tests/requirements.txt
```

(If we migrate to using toml config, the test reqs could be consolidated into an optional section.)
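For illustration, a hypothetical `pyproject.toml` fragment showing what that consolidation could look like; the exact package names are assumptions (pytest is implied by the commands below, pytest-cov by the `--cov` flags):

```toml
# Hypothetical sketch only -- the project does not use toml config yet
[project.optional-dependencies]
test = [
    "pytest",
    "pytest-cov",
]
```

With something like that in place, the test deps would install via `pip install -e ".[test]"` instead of a separate requirements file.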
**Running tests**

From the project root, simply run:

```bash
pytest
# or pick a specific test file to run
pytest tests/test_whale_viewer.py
```

To generate a coverage report to the screen (this also runs the tests):

```bash
pytest --cov=src
```

To generate reports on pass rate and coverage, written to files:

```bash
pytest --junit-xml=test-results.xml
pytest --cov-report=lcov --cov=src
```

## CI testing

Initially we have an action set up that runs all tests in the `tests` directory, within the `test/tests` branch.
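As a rough sketch of what such an action looks like (this is not the repo's actual workflow file; the path, Python version, and install steps are assumptions):

```yaml
# Hypothetical .github/workflows/tests.yml -- a sketch, not the real workflow
name: tests

on:
  push:
    branches: [test/tests]   # the branch named in the note above

jobs:
  run-tests:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"   # assumed version
      - name: Install deps
        run: |
          pip install -r requirements.txt
          pip install -r tests/requirements.txt
      - name: Run tests
        run: pytest tests
```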
TODO: Add some test report & coverage badges to the README.