repo_name: stringclasses (1 value)
pr_number: int64 (4.12k to 11.2k)
pr_title: stringlengths (9 to 107)
pr_description: stringlengths (107 to 5.48k)
author: stringlengths (4 to 18)
date_created: unknown
date_merged: unknown
previous_commit: stringlengths (40 to 40)
pr_commit: stringlengths (40 to 40)
query: stringlengths (118 to 5.52k)
before_content: stringlengths (0 to 7.93M)
after_content: stringlengths (0 to 7.93M)
label: int64 (-1 to 1)
TheAlgorithms/Python
5,744
Improve Project Euler problem 014 solution 2
### **Describe your change:** Improve Project Euler problem 014 solution 2 - the top 1 slowest solution on Travis CI logs (under `slowest 10 durations`: `14.20s call scripts/validate_solutions.py::test_project_euler[problem_014/sol2.py]`): * Improve solution (locally 10+ times - from 15+ seconds to ~1.5 seconds) * Uncomment code that has been commented due to slow execution affecting Travis (now it should be quite fast execution and not affect Travis) * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
MaximSmolskiy
"2021-11-01T19:21:41Z"
"2021-11-04T16:01:22Z"
7a605766fe7fe79a00ba1f30447877be4b77a6f2
729aaf64275c61b8bc864ef9138eed078dea9cb2
Improve Project Euler problem 014 solution 2. ### **Describe your change:** Improve Project Euler problem 014 solution 2 - the top 1 slowest solution on Travis CI logs (under `slowest 10 durations`: `14.20s call scripts/validate_solutions.py::test_project_euler[problem_014/sol2.py]`): * Improve solution (locally 10+ times - from 15+ seconds to ~1.5 seconds) * Uncomment code that has been commented due to slow execution affecting Travis (now it should be quite fast execution and not affect Travis) * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" * Binary Exponentiation with Multiplication * This is a method to find a*b in a time complexity of O(log b) * This is one of the most commonly used methods of finding result of multiplication. * Also useful in cases where solution to (a*b)%c is required, * where a,b,c can be numbers over the computers calculation limits. * Done using iteration, can also be done using recursion * @author chinmoy159 * @version 1.0 dated 10/08/2017 """ def b_expo(a, b): res = 0 while b > 0: if b & 1: res += a a += a b >>= 1 return res def b_expo_mod(a, b, c): res = 0 while b > 0: if b & 1: res = ((res % c) + (a % c)) % c a += a b >>= 1 return res """ * Wondering how this method works ! * It's pretty simple. * Let's say you need to calculate a ^ b * RULE 1 : a * b = (a+a) * (b/2) ---- example : 4 * 4 = (4+4) * (4/2) = 8 * 2 * RULE 2 : IF b is ODD, then ---- a * b = a + (a * (b - 1)) :: where (b - 1) is even. * Once b is even, repeat the process to get a * b * Repeat the process till b = 1 OR b = 0, because a*1 = a AND a*0 = 0 * * As far as the modulo is concerned, * the fact : (a+b) % c = ((a%c) + (b%c)) % c * Now apply RULE 1 OR 2, whichever is required. """
""" * Binary Exponentiation with Multiplication * This is a method to find a*b in a time complexity of O(log b) * This is one of the most commonly used methods of finding result of multiplication. * Also useful in cases where solution to (a*b)%c is required, * where a,b,c can be numbers over the computers calculation limits. * Done using iteration, can also be done using recursion * @author chinmoy159 * @version 1.0 dated 10/08/2017 """ def b_expo(a, b): res = 0 while b > 0: if b & 1: res += a a += a b >>= 1 return res def b_expo_mod(a, b, c): res = 0 while b > 0: if b & 1: res = ((res % c) + (a % c)) % c a += a b >>= 1 return res """ * Wondering how this method works ! * It's pretty simple. * Let's say you need to calculate a ^ b * RULE 1 : a * b = (a+a) * (b/2) ---- example : 4 * 4 = (4+4) * (4/2) = 8 * 2 * RULE 2 : IF b is ODD, then ---- a * b = a + (a * (b - 1)) :: where (b - 1) is even. * Once b is even, repeat the process to get a * b * Repeat the process till b = 1 OR b = 0, because a*1 = a AND a*0 = 0 * * As far as the modulo is concerned, * the fact : (a+b) % c = ((a%c) + (b%c)) % c * Now apply RULE 1 OR 2, whichever is required. """
-1
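As an illustration of the doubling scheme described in the file above (an editor's sketch, not part of the dataset cell, assuming `b_expo` and `b_expo_mod` as defined there are available in the current session), the two routines can be cross-checked against ordinary multiplication:

```python
# Editor's sketch: cross-check the doubling-based routines against plain
# arithmetic. Assumes b_expo and b_expo_mod from the file above are defined.
for a in range(0, 50, 7):
    for b in range(0, 50, 11):
        assert b_expo(a, b) == a * b
        assert b_expo_mod(a, b, 97) == (a * b) % 97
print("all checks passed")
```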
TheAlgorithms/Python
5,744
Improve Project Euler problem 014 solution 2
### **Describe your change:** Improve Project Euler problem 014 solution 2 - the top 1 slowest solution on Travis CI logs (under `slowest 10 durations`: `14.20s call scripts/validate_solutions.py::test_project_euler[problem_014/sol2.py]`): * Improve solution (locally 10+ times - from 15+ seconds to ~1.5 seconds) * Uncomment code that has been commented due to slow execution affecting Travis (now it should be quite fast execution and not affect Travis) * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
MaximSmolskiy
"2021-11-01T19:21:41Z"
"2021-11-04T16:01:22Z"
7a605766fe7fe79a00ba1f30447877be4b77a6f2
729aaf64275c61b8bc864ef9138eed078dea9cb2
Improve Project Euler problem 014 solution 2. ### **Describe your change:** Improve Project Euler problem 014 solution 2 - the top 1 slowest solution on Travis CI logs (under `slowest 10 durations`: `14.20s call scripts/validate_solutions.py::test_project_euler[problem_014/sol2.py]`): * Improve solution (locally 10+ times - from 15+ seconds to ~1.5 seconds) * Uncomment code that has been commented due to slow execution affecting Travis (now it should be quite fast execution and not affect Travis) * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
"""Conway's Game Of Life, Author Anurag Kumar(mailto:[email protected]) Requirements: - numpy - random - time - matplotlib Python: - 3.5 Usage: - $python3 game_o_life <canvas_size:int> Game-Of-Life Rules: 1. Any live cell with fewer than two live neighbours dies, as if caused by under-population. 2. Any live cell with two or three live neighbours lives on to the next generation. 3. Any live cell with more than three live neighbours dies, as if by over-population. 4. Any dead cell with exactly three live neighbours be- comes a live cell, as if by reproduction. """ import random import sys import numpy as np from matplotlib import pyplot as plt from matplotlib.colors import ListedColormap usage_doc = "Usage of script: script_nama <size_of_canvas:int>" choice = [0] * 100 + [1] * 10 random.shuffle(choice) def create_canvas(size: int) -> list[list[bool]]: canvas = [[False for i in range(size)] for j in range(size)] return canvas def seed(canvas: list[list[bool]]) -> None: for i, row in enumerate(canvas): for j, _ in enumerate(row): canvas[i][j] = bool(random.getrandbits(1)) def run(canvas: list[list[bool]]) -> list[list[bool]]: """This function runs the rules of game through all points, and changes their status accordingly.(in the same canvas) @Args: -- canvas : canvas of population to run the rules on. @returns: -- None """ current_canvas = np.array(canvas) next_gen_canvas = np.array(create_canvas(current_canvas.shape[0])) for r, row in enumerate(current_canvas): for c, pt in enumerate(row): # print(r-1,r+2,c-1,c+2) next_gen_canvas[r][c] = __judge_point( pt, current_canvas[r - 1 : r + 2, c - 1 : c + 2] ) current_canvas = next_gen_canvas del next_gen_canvas # cleaning memory as we move on. return_canvas: list[list[bool]] = current_canvas.tolist() return return_canvas def __judge_point(pt: bool, neighbours: list[list[bool]]) -> bool: dead = 0 alive = 0 # finding dead or alive neighbours count. for i in neighbours: for status in i: if status: alive += 1 else: dead += 1 # handling duplicate entry for focus pt. if pt: alive -= 1 else: dead -= 1 # running the rules of game here. state = pt if pt: if alive < 2: state = False elif alive == 2 or alive == 3: state = True elif alive > 3: state = False else: if alive == 3: state = True return state if __name__ == "__main__": if len(sys.argv) != 2: raise Exception(usage_doc) canvas_size = int(sys.argv[1]) # main working structure of this module. c = create_canvas(canvas_size) seed(c) fig, ax = plt.subplots() fig.show() cmap = ListedColormap(["w", "k"]) try: while True: c = run(c) ax.matshow(c, cmap=cmap) fig.canvas.draw() ax.cla() except KeyboardInterrupt: # do nothing. pass
"""Conway's Game Of Life, Author Anurag Kumar(mailto:[email protected]) Requirements: - numpy - random - time - matplotlib Python: - 3.5 Usage: - $python3 game_o_life <canvas_size:int> Game-Of-Life Rules: 1. Any live cell with fewer than two live neighbours dies, as if caused by under-population. 2. Any live cell with two or three live neighbours lives on to the next generation. 3. Any live cell with more than three live neighbours dies, as if by over-population. 4. Any dead cell with exactly three live neighbours be- comes a live cell, as if by reproduction. """ import random import sys import numpy as np from matplotlib import pyplot as plt from matplotlib.colors import ListedColormap usage_doc = "Usage of script: script_nama <size_of_canvas:int>" choice = [0] * 100 + [1] * 10 random.shuffle(choice) def create_canvas(size: int) -> list[list[bool]]: canvas = [[False for i in range(size)] for j in range(size)] return canvas def seed(canvas: list[list[bool]]) -> None: for i, row in enumerate(canvas): for j, _ in enumerate(row): canvas[i][j] = bool(random.getrandbits(1)) def run(canvas: list[list[bool]]) -> list[list[bool]]: """This function runs the rules of game through all points, and changes their status accordingly.(in the same canvas) @Args: -- canvas : canvas of population to run the rules on. @returns: -- None """ current_canvas = np.array(canvas) next_gen_canvas = np.array(create_canvas(current_canvas.shape[0])) for r, row in enumerate(current_canvas): for c, pt in enumerate(row): # print(r-1,r+2,c-1,c+2) next_gen_canvas[r][c] = __judge_point( pt, current_canvas[r - 1 : r + 2, c - 1 : c + 2] ) current_canvas = next_gen_canvas del next_gen_canvas # cleaning memory as we move on. return_canvas: list[list[bool]] = current_canvas.tolist() return return_canvas def __judge_point(pt: bool, neighbours: list[list[bool]]) -> bool: dead = 0 alive = 0 # finding dead or alive neighbours count. for i in neighbours: for status in i: if status: alive += 1 else: dead += 1 # handling duplicate entry for focus pt. if pt: alive -= 1 else: dead -= 1 # running the rules of game here. state = pt if pt: if alive < 2: state = False elif alive == 2 or alive == 3: state = True elif alive > 3: state = False else: if alive == 3: state = True return state if __name__ == "__main__": if len(sys.argv) != 2: raise Exception(usage_doc) canvas_size = int(sys.argv[1]) # main working structure of this module. c = create_canvas(canvas_size) seed(c) fig, ax = plt.subplots() fig.show() cmap = ListedColormap(["w", "k"]) try: while True: c = run(c) ax.matshow(c, cmap=cmap) fig.canvas.draw() ax.cla() except KeyboardInterrupt: # do nothing. pass
-1
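As a quick sanity check of the rules implemented in the file above (an editor's addition, assuming `create_canvas` and `run` from that file are available in the current session, for example by pasting the file in), a horizontal three-cell "blinker" placed away from the borders should flip to a vertical one after a single generation:

```python
# Editor's sketch: a 3-cell blinker in the middle of a 5x5 canvas should
# rotate from horizontal to vertical after one call to run().
blinker = create_canvas(5)
for col in (1, 2, 3):
    blinker[2][col] = True            # horizontal bar in the middle row

next_gen = run(blinker)
live_cells = sorted(
    (r, c) for r in range(5) for c in range(5) if next_gen[r][c]
)
print(live_cells)                     # expected: [(1, 2), (2, 2), (3, 2)]
```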
TheAlgorithms/Python
5,744
Improve Project Euler problem 014 solution 2
### **Describe your change:** Improve Project Euler problem 014 solution 2 - the top 1 slowest solution on Travis CI logs (under `slowest 10 durations`: `14.20s call scripts/validate_solutions.py::test_project_euler[problem_014/sol2.py]`): * Improve solution (locally 10+ times - from 15+ seconds to ~1.5 seconds) * Uncomment code that has been commented due to slow execution affecting Travis (now it should be quite fast execution and not affect Travis) * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
MaximSmolskiy
"2021-11-01T19:21:41Z"
"2021-11-04T16:01:22Z"
7a605766fe7fe79a00ba1f30447877be4b77a6f2
729aaf64275c61b8bc864ef9138eed078dea9cb2
Improve Project Euler problem 014 solution 2. ### **Describe your change:** Improve Project Euler problem 014 solution 2 - the top 1 slowest solution on Travis CI logs (under `slowest 10 durations`: `14.20s call scripts/validate_solutions.py::test_project_euler[problem_014/sol2.py]`): * Improve solution (locally 10+ times - from 15+ seconds to ~1.5 seconds) * Uncomment code that has been commented due to slow execution affecting Travis (now it should be quite fast execution and not affect Travis) * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
def bin_exp_mod(a, n, b): """ >>> bin_exp_mod(3, 4, 5) 1 >>> bin_exp_mod(7, 13, 10) 7 """ # mod b assert b != 0, "This cannot accept modulo that is == 0" if n == 0: return 1 if n % 2 == 1: return (bin_exp_mod(a, n - 1, b) * a) % b r = bin_exp_mod(a, n // 2, b) return (r * r) % b if __name__ == "__main__": try: BASE = int(input("Enter Base : ").strip()) POWER = int(input("Enter Power : ").strip()) MODULO = int(input("Enter Modulo : ").strip()) except ValueError: print("Invalid literal for integer") else: print(bin_exp_mod(BASE, POWER, MODULO))
def bin_exp_mod(a, n, b): """ >>> bin_exp_mod(3, 4, 5) 1 >>> bin_exp_mod(7, 13, 10) 7 """ # mod b assert b != 0, "This cannot accept modulo that is == 0" if n == 0: return 1 if n % 2 == 1: return (bin_exp_mod(a, n - 1, b) * a) % b r = bin_exp_mod(a, n // 2, b) return (r * r) % b if __name__ == "__main__": try: BASE = int(input("Enter Base : ").strip()) POWER = int(input("Enter Power : ").strip()) MODULO = int(input("Enter Modulo : ").strip()) except ValueError: print("Invalid literal for integer") else: print(bin_exp_mod(BASE, POWER, MODULO))
-1
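For a quick check of the recursive modular exponentiation above (an editor's addition, assuming `bin_exp_mod` from the file is defined in the session), the routine should agree with Python's built-in three-argument `pow`:

```python
# Editor's sketch: bin_exp_mod should match the built-in pow(base, exp, mod).
for base, exp, mod in [(3, 4, 5), (7, 13, 10), (2, 100, 1_000_000_007)]:
    assert bin_exp_mod(base, exp, mod) == pow(base, exp, mod)
print("bin_exp_mod agrees with pow()")
```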
TheAlgorithms/Python
5,744
Improve Project Euler problem 014 solution 2
### **Describe your change:** Improve Project Euler problem 014 solution 2 - the top 1 slowest solution on Travis CI logs (under `slowest 10 durations`: `14.20s call scripts/validate_solutions.py::test_project_euler[problem_014/sol2.py]`): * Improve solution (locally 10+ times - from 15+ seconds to ~1.5 seconds) * Uncomment code that has been commented due to slow execution affecting Travis (now it should be quite fast execution and not affect Travis) * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
MaximSmolskiy
"2021-11-01T19:21:41Z"
"2021-11-04T16:01:22Z"
7a605766fe7fe79a00ba1f30447877be4b77a6f2
729aaf64275c61b8bc864ef9138eed078dea9cb2
Improve Project Euler problem 014 solution 2. ### **Describe your change:** Improve Project Euler problem 014 solution 2 - the top 1 slowest solution on Travis CI logs (under `slowest 10 durations`: `14.20s call scripts/validate_solutions.py::test_project_euler[problem_014/sol2.py]`): * Improve solution (locally 10+ times - from 15+ seconds to ~1.5 seconds) * Uncomment code that has been commented due to slow execution affecting Travis (now it should be quite fast execution and not affect Travis) * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
-1
TheAlgorithms/Python
5,744
Improve Project Euler problem 014 solution 2
### **Describe your change:** Improve Project Euler problem 014 solution 2 - the top 1 slowest solution on Travis CI logs (under `slowest 10 durations`: `14.20s call scripts/validate_solutions.py::test_project_euler[problem_014/sol2.py]`): * Improve solution (locally 10+ times - from 15+ seconds to ~1.5 seconds) * Uncomment code that has been commented due to slow execution affecting Travis (now it should be quite fast execution and not affect Travis) * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
MaximSmolskiy
"2021-11-01T19:21:41Z"
"2021-11-04T16:01:22Z"
7a605766fe7fe79a00ba1f30447877be4b77a6f2
729aaf64275c61b8bc864ef9138eed078dea9cb2
Improve Project Euler problem 014 solution 2. ### **Describe your change:** Improve Project Euler problem 014 solution 2 - the top 1 slowest solution on Travis CI logs (under `slowest 10 durations`: `14.20s call scripts/validate_solutions.py::test_project_euler[problem_014/sol2.py]`): * Improve solution (locally 10+ times - from 15+ seconds to ~1.5 seconds) * Uncomment code that has been commented due to slow execution affecting Travis (now it should be quite fast execution and not affect Travis) * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
from __future__ import annotations import math from typing import Callable def line_length( fnc: Callable[[int | float], int | float], x_start: int | float, x_end: int | float, steps: int = 100, ) -> float: """ Approximates the arc length of a line segment by treating the curve as a sequence of linear lines and summing their lengths :param fnc: a function which defines a curve :param x_start: left end point to indicate the start of line segment :param x_end: right end point to indicate end of line segment :param steps: an accuracy gauge; more steps increases accuracy :return: a float representing the length of the curve >>> def f(x): ... return x >>> f"{line_length(f, 0, 1, 10):.6f}" '1.414214' >>> def f(x): ... return 1 >>> f"{line_length(f, -5.5, 4.5):.6f}" '10.000000' >>> def f(x): ... return math.sin(5 * x) + math.cos(10 * x) + x * x/10 >>> f"{line_length(f, 0.0, 10.0, 10000):.6f}" '69.534930' """ x1 = x_start fx1 = fnc(x_start) length = 0.0 for i in range(steps): # Approximates curve as a sequence of linear lines and sums their length x2 = (x_end - x_start) / steps + x1 fx2 = fnc(x2) length += math.hypot(x2 - x1, fx2 - fx1) # Increment step x1 = x2 fx1 = fx2 return length if __name__ == "__main__": def f(x): return math.sin(10 * x) print("f(x) = sin(10 * x)") print("The length of the curve from x = -10 to x = 10 is:") i = 10 while i <= 100000: print(f"With {i} steps: {line_length(f, -10, 10, i)}") i *= 10
from __future__ import annotations import math from typing import Callable def line_length( fnc: Callable[[int | float], int | float], x_start: int | float, x_end: int | float, steps: int = 100, ) -> float: """ Approximates the arc length of a line segment by treating the curve as a sequence of linear lines and summing their lengths :param fnc: a function which defines a curve :param x_start: left end point to indicate the start of line segment :param x_end: right end point to indicate end of line segment :param steps: an accuracy gauge; more steps increases accuracy :return: a float representing the length of the curve >>> def f(x): ... return x >>> f"{line_length(f, 0, 1, 10):.6f}" '1.414214' >>> def f(x): ... return 1 >>> f"{line_length(f, -5.5, 4.5):.6f}" '10.000000' >>> def f(x): ... return math.sin(5 * x) + math.cos(10 * x) + x * x/10 >>> f"{line_length(f, 0.0, 10.0, 10000):.6f}" '69.534930' """ x1 = x_start fx1 = fnc(x_start) length = 0.0 for i in range(steps): # Approximates curve as a sequence of linear lines and sums their length x2 = (x_end - x_start) / steps + x1 fx2 = fnc(x2) length += math.hypot(x2 - x1, fx2 - fx1) # Increment step x1 = x2 fx1 = fx2 return length if __name__ == "__main__": def f(x): return math.sin(10 * x) print("f(x) = sin(10 * x)") print("The length of the curve from x = -10 to x = 10 is:") i = 10 while i <= 100000: print(f"With {i} steps: {line_length(f, -10, 10, i)}") i *= 10
-1
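A further illustration of the polygonal arc-length approximation above (an editor's addition, assuming `line_length` from that file is available in the session): for a curve whose exact arc length is known, such as a quarter of the unit circle, the approximation should land close to pi/2.

```python
# Editor's sketch: the quarter unit circle y = sqrt(1 - x^2) on [0, 1] has
# arc length pi/2; with enough steps line_length should get close to it.
import math


def quarter_circle(x):
    # max() guards against a tiny negative argument caused by float round-off
    return math.sqrt(max(0.0, 1 - x * x))


approx = line_length(quarter_circle, 0, 1, 100_000)
print(f"{approx:.5f} vs {math.pi / 2:.5f}")  # the two values should agree to several decimals
```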
TheAlgorithms/Python
5,744
Improve Project Euler problem 014 solution 2
### **Describe your change:** Improve Project Euler problem 014 solution 2 - the top 1 slowest solution on Travis CI logs (under `slowest 10 durations`: `14.20s call scripts/validate_solutions.py::test_project_euler[problem_014/sol2.py]`): * Improve solution (locally 10+ times - from 15+ seconds to ~1.5 seconds) * Uncomment code that has been commented due to slow execution affecting Travis (now it should be quite fast execution and not affect Travis) * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
MaximSmolskiy
"2021-11-01T19:21:41Z"
"2021-11-04T16:01:22Z"
7a605766fe7fe79a00ba1f30447877be4b77a6f2
729aaf64275c61b8bc864ef9138eed078dea9cb2
Improve Project Euler problem 014 solution 2. ### **Describe your change:** Improve Project Euler problem 014 solution 2 - the top 1 slowest solution on Travis CI logs (under `slowest 10 durations`: `14.20s call scripts/validate_solutions.py::test_project_euler[problem_014/sol2.py]`): * Improve solution (locally 10+ times - from 15+ seconds to ~1.5 seconds) * Uncomment code that has been commented due to slow execution affecting Travis (now it should be quite fast execution and not affect Travis) * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
-1
TheAlgorithms/Python
5,744
Improve Project Euler problem 014 solution 2
### **Describe your change:** Improve Project Euler problem 014 solution 2 - the top 1 slowest solution on Travis CI logs (under `slowest 10 durations`: `14.20s call scripts/validate_solutions.py::test_project_euler[problem_014/sol2.py]`): * Improve solution (locally 10+ times - from 15+ seconds to ~1.5 seconds) * Uncomment code that has been commented due to slow execution affecting Travis (now it should be quite fast execution and not affect Travis) * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
MaximSmolskiy
"2021-11-01T19:21:41Z"
"2021-11-04T16:01:22Z"
7a605766fe7fe79a00ba1f30447877be4b77a6f2
729aaf64275c61b8bc864ef9138eed078dea9cb2
Improve Project Euler problem 014 solution 2. ### **Describe your change:** Improve Project Euler problem 014 solution 2 - the top 1 slowest solution on Travis CI logs (under `slowest 10 durations`: `14.20s call scripts/validate_solutions.py::test_project_euler[problem_014/sol2.py]`): * Improve solution (locally 10+ times - from 15+ seconds to ~1.5 seconds) * Uncomment code that has been commented due to slow execution affecting Travis (now it should be quite fast execution and not affect Travis) * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" If we are presented with the first k terms of a sequence it is impossible to say with certainty the value of the next term, as there are infinitely many polynomial functions that can model the sequence. As an example, let us consider the sequence of cube numbers. This is defined by the generating function, u(n) = n3: 1, 8, 27, 64, 125, 216, ... Suppose we were only given the first two terms of this sequence. Working on the principle that "simple is best" we should assume a linear relationship and predict the next term to be 15 (common difference 7). Even if we were presented with the first three terms, by the same principle of simplicity, a quadratic relationship should be assumed. We shall define OP(k, n) to be the nth term of the optimum polynomial generating function for the first k terms of a sequence. It should be clear that OP(k, n) will accurately generate the terms of the sequence for n ≤ k, and potentially the first incorrect term (FIT) will be OP(k, k+1); in which case we shall call it a bad OP (BOP). As a basis, if we were only given the first term of sequence, it would be most sensible to assume constancy; that is, for n ≥ 2, OP(1, n) = u(1). Hence we obtain the following OPs for the cubic sequence: OP(1, n) = 1 1, 1, 1, 1, ... OP(2, n) = 7n-6 1, 8, 15, ... OP(3, n) = 6n^2-11n+6 1, 8, 27, 58, ... OP(4, n) = n^3 1, 8, 27, 64, 125, ... Clearly no BOPs exist for k ≥ 4. By considering the sum of FITs generated by the BOPs (indicated in red above), we obtain 1 + 15 + 58 = 74. Consider the following tenth degree polynomial generating function: 1 - n + n^2 - n^3 + n^4 - n^5 + n^6 - n^7 + n^8 - n^9 + n^10 Find the sum of FITs for the BOPs. """ from __future__ import annotations from typing import Callable, Union Matrix = list[list[Union[float, int]]] def solve(matrix: Matrix, vector: Matrix) -> Matrix: """ Solve the linear system of equations Ax = b (A = "matrix", b = "vector") for x using Gaussian elimination and back substitution. We assume that A is an invertible square matrix and that b is a column vector of the same height. >>> solve([[1, 0], [0, 1]], [[1],[2]]) [[1.0], [2.0]] >>> solve([[2, 1, -1],[-3, -1, 2],[-2, 1, 2]],[[8], [-11],[-3]]) [[2.0], [3.0], [-1.0]] """ size: int = len(matrix) augmented: Matrix = [[0 for _ in range(size + 1)] for _ in range(size)] row: int row2: int col: int col2: int pivot_row: int ratio: float for row in range(size): for col in range(size): augmented[row][col] = matrix[row][col] augmented[row][size] = vector[row][0] row = 0 col = 0 while row < size and col < size: # pivoting pivot_row = max((abs(augmented[row2][col]), row2) for row2 in range(col, size))[ 1 ] if augmented[pivot_row][col] == 0: col += 1 continue else: augmented[row], augmented[pivot_row] = augmented[pivot_row], augmented[row] for row2 in range(row + 1, size): ratio = augmented[row2][col] / augmented[row][col] augmented[row2][col] = 0 for col2 in range(col + 1, size + 1): augmented[row2][col2] -= augmented[row][col2] * ratio row += 1 col += 1 # back substitution for col in range(1, size): for row in range(col): ratio = augmented[row][col] / augmented[col][col] for col2 in range(col, size + 1): augmented[row][col2] -= augmented[col][col2] * ratio # round to get rid of numbers like 2.000000000000004 return [ [round(augmented[row][size] / augmented[row][row], 10)] for row in range(size) ] def interpolate(y_list: list[int]) -> Callable[[int], int]: """ Given a list of data points (1,y0),(2,y1), ..., return a function that interpolates the data points. 
We find the coefficients of the interpolating polynomial by solving a system of linear equations corresponding to x = 1, 2, 3... >>> interpolate([1])(3) 1 >>> interpolate([1, 8])(3) 15 >>> interpolate([1, 8, 27])(4) 58 >>> interpolate([1, 8, 27, 64])(6) 216 """ size: int = len(y_list) matrix: Matrix = [[0 for _ in range(size)] for _ in range(size)] vector: Matrix = [[0] for _ in range(size)] coeffs: Matrix x_val: int y_val: int col: int for x_val, y_val in enumerate(y_list): for col in range(size): matrix[x_val][col] = (x_val + 1) ** (size - col - 1) vector[x_val][0] = y_val coeffs = solve(matrix, vector) def interpolated_func(var: int) -> int: """ >>> interpolate([1])(3) 1 >>> interpolate([1, 8])(3) 15 >>> interpolate([1, 8, 27])(4) 58 >>> interpolate([1, 8, 27, 64])(6) 216 """ return sum( round(coeffs[x_val][0]) * (var ** (size - x_val - 1)) for x_val in range(size) ) return interpolated_func def question_function(variable: int) -> int: """ The generating function u as specified in the question. >>> question_function(0) 1 >>> question_function(1) 1 >>> question_function(5) 8138021 >>> question_function(10) 9090909091 """ return ( 1 - variable + variable ** 2 - variable ** 3 + variable ** 4 - variable ** 5 + variable ** 6 - variable ** 7 + variable ** 8 - variable ** 9 + variable ** 10 ) def solution(func: Callable[[int], int] = question_function, order: int = 10) -> int: """ Find the sum of the FITs of the BOPS. For each interpolating polynomial of order 1, 2, ... , 10, find the first x such that the value of the polynomial at x does not equal u(x). >>> solution(lambda n: n ** 3, 3) 74 """ data_points: list[int] = [func(x_val) for x_val in range(1, order + 1)] polynomials: list[Callable[[int], int]] = [ interpolate(data_points[:max_coeff]) for max_coeff in range(1, order + 1) ] ret: int = 0 poly: Callable[[int], int] x_val: int for poly in polynomials: x_val = 1 while func(x_val) == poly(x_val): x_val += 1 ret += poly(x_val) return ret if __name__ == "__main__": print(f"{solution() = }")
""" If we are presented with the first k terms of a sequence it is impossible to say with certainty the value of the next term, as there are infinitely many polynomial functions that can model the sequence. As an example, let us consider the sequence of cube numbers. This is defined by the generating function, u(n) = n3: 1, 8, 27, 64, 125, 216, ... Suppose we were only given the first two terms of this sequence. Working on the principle that "simple is best" we should assume a linear relationship and predict the next term to be 15 (common difference 7). Even if we were presented with the first three terms, by the same principle of simplicity, a quadratic relationship should be assumed. We shall define OP(k, n) to be the nth term of the optimum polynomial generating function for the first k terms of a sequence. It should be clear that OP(k, n) will accurately generate the terms of the sequence for n ≤ k, and potentially the first incorrect term (FIT) will be OP(k, k+1); in which case we shall call it a bad OP (BOP). As a basis, if we were only given the first term of sequence, it would be most sensible to assume constancy; that is, for n ≥ 2, OP(1, n) = u(1). Hence we obtain the following OPs for the cubic sequence: OP(1, n) = 1 1, 1, 1, 1, ... OP(2, n) = 7n-6 1, 8, 15, ... OP(3, n) = 6n^2-11n+6 1, 8, 27, 58, ... OP(4, n) = n^3 1, 8, 27, 64, 125, ... Clearly no BOPs exist for k ≥ 4. By considering the sum of FITs generated by the BOPs (indicated in red above), we obtain 1 + 15 + 58 = 74. Consider the following tenth degree polynomial generating function: 1 - n + n^2 - n^3 + n^4 - n^5 + n^6 - n^7 + n^8 - n^9 + n^10 Find the sum of FITs for the BOPs. """ from __future__ import annotations from typing import Callable, Union Matrix = list[list[Union[float, int]]] def solve(matrix: Matrix, vector: Matrix) -> Matrix: """ Solve the linear system of equations Ax = b (A = "matrix", b = "vector") for x using Gaussian elimination and back substitution. We assume that A is an invertible square matrix and that b is a column vector of the same height. >>> solve([[1, 0], [0, 1]], [[1],[2]]) [[1.0], [2.0]] >>> solve([[2, 1, -1],[-3, -1, 2],[-2, 1, 2]],[[8], [-11],[-3]]) [[2.0], [3.0], [-1.0]] """ size: int = len(matrix) augmented: Matrix = [[0 for _ in range(size + 1)] for _ in range(size)] row: int row2: int col: int col2: int pivot_row: int ratio: float for row in range(size): for col in range(size): augmented[row][col] = matrix[row][col] augmented[row][size] = vector[row][0] row = 0 col = 0 while row < size and col < size: # pivoting pivot_row = max((abs(augmented[row2][col]), row2) for row2 in range(col, size))[ 1 ] if augmented[pivot_row][col] == 0: col += 1 continue else: augmented[row], augmented[pivot_row] = augmented[pivot_row], augmented[row] for row2 in range(row + 1, size): ratio = augmented[row2][col] / augmented[row][col] augmented[row2][col] = 0 for col2 in range(col + 1, size + 1): augmented[row2][col2] -= augmented[row][col2] * ratio row += 1 col += 1 # back substitution for col in range(1, size): for row in range(col): ratio = augmented[row][col] / augmented[col][col] for col2 in range(col, size + 1): augmented[row][col2] -= augmented[col][col2] * ratio # round to get rid of numbers like 2.000000000000004 return [ [round(augmented[row][size] / augmented[row][row], 10)] for row in range(size) ] def interpolate(y_list: list[int]) -> Callable[[int], int]: """ Given a list of data points (1,y0),(2,y1), ..., return a function that interpolates the data points. 
We find the coefficients of the interpolating polynomial by solving a system of linear equations corresponding to x = 1, 2, 3... >>> interpolate([1])(3) 1 >>> interpolate([1, 8])(3) 15 >>> interpolate([1, 8, 27])(4) 58 >>> interpolate([1, 8, 27, 64])(6) 216 """ size: int = len(y_list) matrix: Matrix = [[0 for _ in range(size)] for _ in range(size)] vector: Matrix = [[0] for _ in range(size)] coeffs: Matrix x_val: int y_val: int col: int for x_val, y_val in enumerate(y_list): for col in range(size): matrix[x_val][col] = (x_val + 1) ** (size - col - 1) vector[x_val][0] = y_val coeffs = solve(matrix, vector) def interpolated_func(var: int) -> int: """ >>> interpolate([1])(3) 1 >>> interpolate([1, 8])(3) 15 >>> interpolate([1, 8, 27])(4) 58 >>> interpolate([1, 8, 27, 64])(6) 216 """ return sum( round(coeffs[x_val][0]) * (var ** (size - x_val - 1)) for x_val in range(size) ) return interpolated_func def question_function(variable: int) -> int: """ The generating function u as specified in the question. >>> question_function(0) 1 >>> question_function(1) 1 >>> question_function(5) 8138021 >>> question_function(10) 9090909091 """ return ( 1 - variable + variable ** 2 - variable ** 3 + variable ** 4 - variable ** 5 + variable ** 6 - variable ** 7 + variable ** 8 - variable ** 9 + variable ** 10 ) def solution(func: Callable[[int], int] = question_function, order: int = 10) -> int: """ Find the sum of the FITs of the BOPS. For each interpolating polynomial of order 1, 2, ... , 10, find the first x such that the value of the polynomial at x does not equal u(x). >>> solution(lambda n: n ** 3, 3) 74 """ data_points: list[int] = [func(x_val) for x_val in range(1, order + 1)] polynomials: list[Callable[[int], int]] = [ interpolate(data_points[:max_coeff]) for max_coeff in range(1, order + 1) ] ret: int = 0 poly: Callable[[int], int] x_val: int for poly in polynomials: x_val = 1 while func(x_val) == poly(x_val): x_val += 1 ret += poly(x_val) return ret if __name__ == "__main__": print(f"{solution() = }")
-1
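To tie the code above back to the worked example in the problem statement (an editor's addition, assuming `interpolate` and `solution` from that file are available in the session), the optimum polynomials for the cube sequence reproduce the FITs 1, 15 and 58 and their sum 74:

```python
# Editor's sketch: reproduce OP(1, 2) = 1, OP(2, 3) = 15, OP(3, 4) = 58 for
# the cube sequence u(n) = n**3, and the sum of FITs quoted in the statement.
cubes = [n ** 3 for n in range(1, 5)]        # 1, 8, 27, 64
print(interpolate(cubes[:1])(2))             # 1
print(interpolate(cubes[:2])(3))             # 15
print(interpolate(cubes[:3])(4))             # 58
print(solution(lambda n: n ** 3, 3))         # 74
```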
TheAlgorithms/Python
5,638
Add pyupgrade to pre-commit
### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2021-10-28T14:37:17Z"
"2021-10-28T14:45:59Z"
70368a757e8d37b9f3dd96af4ca535275cb39580
477cc3fe597fd931c742700284016b937c778fe1
Add pyupgrade to pre-commit. ### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
repos: - repo: https://github.com/pre-commit/pre-commit-hooks rev: v3.4.0 hooks: - id: check-executables-have-shebangs - id: check-yaml - id: end-of-file-fixer types: [python] - id: trailing-whitespace exclude: | (?x)^( data_structures/heap/binomial_heap.py )$ - id: requirements-txt-fixer - repo: https://github.com/psf/black rev: 21.4b0 hooks: - id: black - repo: https://github.com/PyCQA/isort rev: 5.8.0 hooks: - id: isort args: - --profile=black - repo: https://gitlab.com/pycqa/flake8 rev: 3.9.1 hooks: - id: flake8 args: - --ignore=E203,W503 - --max-complexity=25 - --max-line-length=88 # FIXME: fix mypy errors and then uncomment this # - repo: https://github.com/pre-commit/mirrors-mypy # rev: v0.782 # hooks: # - id: mypy # args: # - --ignore-missing-imports - repo: https://github.com/codespell-project/codespell rev: v2.0.0 hooks: - id: codespell args: - --ignore-words-list=ans,crate,fo,followings,hist,iff,mater,secant,som,tim - --skip="./.*,./strings/dictionary.txt,./strings/words.txt,./project_euler/problem_022/p022_names.txt" - --quiet-level=2 exclude: | (?x)^( strings/dictionary.txt | strings/words.txt | project_euler/problem_022/p022_names.txt )$ - repo: local hooks: - id: validate-filenames name: Validate filenames entry: ./scripts/validate_filenames.py language: script pass_filenames: false
repos: - repo: https://github.com/pre-commit/pre-commit-hooks rev: v3.4.0 hooks: - id: check-executables-have-shebangs - id: check-yaml - id: end-of-file-fixer types: [python] - id: trailing-whitespace exclude: | (?x)^( data_structures/heap/binomial_heap.py )$ - id: requirements-txt-fixer - repo: https://github.com/psf/black rev: 21.4b0 hooks: - id: black - repo: https://github.com/PyCQA/isort rev: 5.8.0 hooks: - id: isort args: - --profile=black - repo: https://github.com/asottile/pyupgrade rev: v2.29.0 hooks: - id: pyupgrade args: - --py39-plus - repo: https://gitlab.com/pycqa/flake8 rev: 3.9.1 hooks: - id: flake8 args: - --ignore=E203,W503 - --max-complexity=25 - --max-line-length=88 # FIXME: fix mypy errors and then uncomment this # - repo: https://github.com/pre-commit/mirrors-mypy # rev: v0.782 # hooks: # - id: mypy # args: # - --ignore-missing-imports - repo: https://github.com/codespell-project/codespell rev: v2.0.0 hooks: - id: codespell args: - --ignore-words-list=ans,crate,fo,followings,hist,iff,mater,secant,som,tim - --skip="./.*,./strings/dictionary.txt,./strings/words.txt,./project_euler/problem_022/p022_names.txt" - --quiet-level=2 exclude: | (?x)^( strings/dictionary.txt | strings/words.txt | project_euler/problem_022/p022_names.txt )$ - repo: local hooks: - id: validate-filenames name: Validate filenames entry: ./scripts/validate_filenames.py language: script pass_filenames: false
1
TheAlgorithms/Python
5,638
Add pyupgrade to pre-commit
### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2021-10-28T14:37:17Z"
"2021-10-28T14:45:59Z"
70368a757e8d37b9f3dd96af4ca535275cb39580
477cc3fe597fd931c742700284016b937c778fe1
Add pyupgrade to pre-commit. ### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
## Arithmetic Analysis * [Bisection](https://github.com/TheAlgorithms/Python/blob/master/arithmetic_analysis/bisection.py) * [Gaussian Elimination](https://github.com/TheAlgorithms/Python/blob/master/arithmetic_analysis/gaussian_elimination.py) * [In Static Equilibrium](https://github.com/TheAlgorithms/Python/blob/master/arithmetic_analysis/in_static_equilibrium.py) * [Intersection](https://github.com/TheAlgorithms/Python/blob/master/arithmetic_analysis/intersection.py) * [Lu Decomposition](https://github.com/TheAlgorithms/Python/blob/master/arithmetic_analysis/lu_decomposition.py) * [Newton Forward Interpolation](https://github.com/TheAlgorithms/Python/blob/master/arithmetic_analysis/newton_forward_interpolation.py) * [Newton Method](https://github.com/TheAlgorithms/Python/blob/master/arithmetic_analysis/newton_method.py) * [Newton Raphson](https://github.com/TheAlgorithms/Python/blob/master/arithmetic_analysis/newton_raphson.py) * [Secant Method](https://github.com/TheAlgorithms/Python/blob/master/arithmetic_analysis/secant_method.py) ## Audio Filters * [Butterworth Filter](https://github.com/TheAlgorithms/Python/blob/master/audio_filters/butterworth_filter.py) * [Iir Filter](https://github.com/TheAlgorithms/Python/blob/master/audio_filters/iir_filter.py) * [Show Response](https://github.com/TheAlgorithms/Python/blob/master/audio_filters/show_response.py) ## Backtracking * [All Combinations](https://github.com/TheAlgorithms/Python/blob/master/backtracking/all_combinations.py) * [All Permutations](https://github.com/TheAlgorithms/Python/blob/master/backtracking/all_permutations.py) * [All Subsequences](https://github.com/TheAlgorithms/Python/blob/master/backtracking/all_subsequences.py) * [Coloring](https://github.com/TheAlgorithms/Python/blob/master/backtracking/coloring.py) * [Hamiltonian Cycle](https://github.com/TheAlgorithms/Python/blob/master/backtracking/hamiltonian_cycle.py) * [Knight Tour](https://github.com/TheAlgorithms/Python/blob/master/backtracking/knight_tour.py) * [Minimax](https://github.com/TheAlgorithms/Python/blob/master/backtracking/minimax.py) * [N Queens](https://github.com/TheAlgorithms/Python/blob/master/backtracking/n_queens.py) * [N Queens Math](https://github.com/TheAlgorithms/Python/blob/master/backtracking/n_queens_math.py) * [Rat In Maze](https://github.com/TheAlgorithms/Python/blob/master/backtracking/rat_in_maze.py) * [Sudoku](https://github.com/TheAlgorithms/Python/blob/master/backtracking/sudoku.py) * [Sum Of Subsets](https://github.com/TheAlgorithms/Python/blob/master/backtracking/sum_of_subsets.py) ## Bit Manipulation * [Binary And Operator](https://github.com/TheAlgorithms/Python/blob/master/bit_manipulation/binary_and_operator.py) * [Binary Count Setbits](https://github.com/TheAlgorithms/Python/blob/master/bit_manipulation/binary_count_setbits.py) * [Binary Count Trailing Zeros](https://github.com/TheAlgorithms/Python/blob/master/bit_manipulation/binary_count_trailing_zeros.py) * [Binary Or Operator](https://github.com/TheAlgorithms/Python/blob/master/bit_manipulation/binary_or_operator.py) * [Binary Shifts](https://github.com/TheAlgorithms/Python/blob/master/bit_manipulation/binary_shifts.py) * [Binary Twos Complement](https://github.com/TheAlgorithms/Python/blob/master/bit_manipulation/binary_twos_complement.py) * [Binary Xor Operator](https://github.com/TheAlgorithms/Python/blob/master/bit_manipulation/binary_xor_operator.py) * [Count 1S Brian Kernighan 
Method](https://github.com/TheAlgorithms/Python/blob/master/bit_manipulation/count_1s_brian_kernighan_method.py) * [Count Number Of One Bits](https://github.com/TheAlgorithms/Python/blob/master/bit_manipulation/count_number_of_one_bits.py) * [Reverse Bits](https://github.com/TheAlgorithms/Python/blob/master/bit_manipulation/reverse_bits.py) * [Single Bit Manipulation Operations](https://github.com/TheAlgorithms/Python/blob/master/bit_manipulation/single_bit_manipulation_operations.py) ## Blockchain * [Chinese Remainder Theorem](https://github.com/TheAlgorithms/Python/blob/master/blockchain/chinese_remainder_theorem.py) * [Diophantine Equation](https://github.com/TheAlgorithms/Python/blob/master/blockchain/diophantine_equation.py) * [Modular Division](https://github.com/TheAlgorithms/Python/blob/master/blockchain/modular_division.py) ## Boolean Algebra * [Quine Mc Cluskey](https://github.com/TheAlgorithms/Python/blob/master/boolean_algebra/quine_mc_cluskey.py) ## Cellular Automata * [Conways Game Of Life](https://github.com/TheAlgorithms/Python/blob/master/cellular_automata/conways_game_of_life.py) * [Game Of Life](https://github.com/TheAlgorithms/Python/blob/master/cellular_automata/game_of_life.py) * [Nagel Schrekenberg](https://github.com/TheAlgorithms/Python/blob/master/cellular_automata/nagel_schrekenberg.py) * [One Dimensional](https://github.com/TheAlgorithms/Python/blob/master/cellular_automata/one_dimensional.py) ## Ciphers * [A1Z26](https://github.com/TheAlgorithms/Python/blob/master/ciphers/a1z26.py) * [Affine Cipher](https://github.com/TheAlgorithms/Python/blob/master/ciphers/affine_cipher.py) * [Atbash](https://github.com/TheAlgorithms/Python/blob/master/ciphers/atbash.py) * [Baconian Cipher](https://github.com/TheAlgorithms/Python/blob/master/ciphers/baconian_cipher.py) * [Base16](https://github.com/TheAlgorithms/Python/blob/master/ciphers/base16.py) * [Base32](https://github.com/TheAlgorithms/Python/blob/master/ciphers/base32.py) * [Base64 Encoding](https://github.com/TheAlgorithms/Python/blob/master/ciphers/base64_encoding.py) * [Base85](https://github.com/TheAlgorithms/Python/blob/master/ciphers/base85.py) * [Beaufort Cipher](https://github.com/TheAlgorithms/Python/blob/master/ciphers/beaufort_cipher.py) * [Bifid](https://github.com/TheAlgorithms/Python/blob/master/ciphers/bifid.py) * [Brute Force Caesar Cipher](https://github.com/TheAlgorithms/Python/blob/master/ciphers/brute_force_caesar_cipher.py) * [Caesar Cipher](https://github.com/TheAlgorithms/Python/blob/master/ciphers/caesar_cipher.py) * [Cryptomath Module](https://github.com/TheAlgorithms/Python/blob/master/ciphers/cryptomath_module.py) * [Decrypt Caesar With Chi Squared](https://github.com/TheAlgorithms/Python/blob/master/ciphers/decrypt_caesar_with_chi_squared.py) * [Deterministic Miller Rabin](https://github.com/TheAlgorithms/Python/blob/master/ciphers/deterministic_miller_rabin.py) * [Diffie](https://github.com/TheAlgorithms/Python/blob/master/ciphers/diffie.py) * [Diffie Hellman](https://github.com/TheAlgorithms/Python/blob/master/ciphers/diffie_hellman.py) * [Elgamal Key Generator](https://github.com/TheAlgorithms/Python/blob/master/ciphers/elgamal_key_generator.py) * [Enigma Machine2](https://github.com/TheAlgorithms/Python/blob/master/ciphers/enigma_machine2.py) * [Hill Cipher](https://github.com/TheAlgorithms/Python/blob/master/ciphers/hill_cipher.py) * [Mixed Keyword Cypher](https://github.com/TheAlgorithms/Python/blob/master/ciphers/mixed_keyword_cypher.py) * [Mono Alphabetic 
Ciphers](https://github.com/TheAlgorithms/Python/blob/master/ciphers/mono_alphabetic_ciphers.py) * [Morse Code](https://github.com/TheAlgorithms/Python/blob/master/ciphers/morse_code.py) * [Onepad Cipher](https://github.com/TheAlgorithms/Python/blob/master/ciphers/onepad_cipher.py) * [Playfair Cipher](https://github.com/TheAlgorithms/Python/blob/master/ciphers/playfair_cipher.py) * [Polybius](https://github.com/TheAlgorithms/Python/blob/master/ciphers/polybius.py) * [Porta Cipher](https://github.com/TheAlgorithms/Python/blob/master/ciphers/porta_cipher.py) * [Rabin Miller](https://github.com/TheAlgorithms/Python/blob/master/ciphers/rabin_miller.py) * [Rail Fence Cipher](https://github.com/TheAlgorithms/Python/blob/master/ciphers/rail_fence_cipher.py) * [Rot13](https://github.com/TheAlgorithms/Python/blob/master/ciphers/rot13.py) * [Rsa Cipher](https://github.com/TheAlgorithms/Python/blob/master/ciphers/rsa_cipher.py) * [Rsa Factorization](https://github.com/TheAlgorithms/Python/blob/master/ciphers/rsa_factorization.py) * [Rsa Key Generator](https://github.com/TheAlgorithms/Python/blob/master/ciphers/rsa_key_generator.py) * [Shuffled Shift Cipher](https://github.com/TheAlgorithms/Python/blob/master/ciphers/shuffled_shift_cipher.py) * [Simple Keyword Cypher](https://github.com/TheAlgorithms/Python/blob/master/ciphers/simple_keyword_cypher.py) * [Simple Substitution Cipher](https://github.com/TheAlgorithms/Python/blob/master/ciphers/simple_substitution_cipher.py) * [Trafid Cipher](https://github.com/TheAlgorithms/Python/blob/master/ciphers/trafid_cipher.py) * [Transposition Cipher](https://github.com/TheAlgorithms/Python/blob/master/ciphers/transposition_cipher.py) * [Transposition Cipher Encrypt Decrypt File](https://github.com/TheAlgorithms/Python/blob/master/ciphers/transposition_cipher_encrypt_decrypt_file.py) * [Vigenere Cipher](https://github.com/TheAlgorithms/Python/blob/master/ciphers/vigenere_cipher.py) * [Xor Cipher](https://github.com/TheAlgorithms/Python/blob/master/ciphers/xor_cipher.py) ## Compression * [Burrows Wheeler](https://github.com/TheAlgorithms/Python/blob/master/compression/burrows_wheeler.py) * [Huffman](https://github.com/TheAlgorithms/Python/blob/master/compression/huffman.py) * [Lempel Ziv](https://github.com/TheAlgorithms/Python/blob/master/compression/lempel_ziv.py) * [Lempel Ziv Decompress](https://github.com/TheAlgorithms/Python/blob/master/compression/lempel_ziv_decompress.py) * [Peak Signal To Noise Ratio](https://github.com/TheAlgorithms/Python/blob/master/compression/peak_signal_to_noise_ratio.py) ## Computer Vision * [Cnn Classification](https://github.com/TheAlgorithms/Python/blob/master/computer_vision/cnn_classification.py) * [Harris Corner](https://github.com/TheAlgorithms/Python/blob/master/computer_vision/harris_corner.py) * [Mean Threshold](https://github.com/TheAlgorithms/Python/blob/master/computer_vision/mean_threshold.py) ## Conversions * [Binary To Decimal](https://github.com/TheAlgorithms/Python/blob/master/conversions/binary_to_decimal.py) * [Binary To Hexadecimal](https://github.com/TheAlgorithms/Python/blob/master/conversions/binary_to_hexadecimal.py) * [Binary To Octal](https://github.com/TheAlgorithms/Python/blob/master/conversions/binary_to_octal.py) * [Decimal To Any](https://github.com/TheAlgorithms/Python/blob/master/conversions/decimal_to_any.py) * [Decimal To Binary](https://github.com/TheAlgorithms/Python/blob/master/conversions/decimal_to_binary.py) * [Decimal To Binary 
Recursion](https://github.com/TheAlgorithms/Python/blob/master/conversions/decimal_to_binary_recursion.py) * [Decimal To Hexadecimal](https://github.com/TheAlgorithms/Python/blob/master/conversions/decimal_to_hexadecimal.py) * [Decimal To Octal](https://github.com/TheAlgorithms/Python/blob/master/conversions/decimal_to_octal.py) * [Hex To Bin](https://github.com/TheAlgorithms/Python/blob/master/conversions/hex_to_bin.py) * [Hexadecimal To Decimal](https://github.com/TheAlgorithms/Python/blob/master/conversions/hexadecimal_to_decimal.py) * [Length Conversion](https://github.com/TheAlgorithms/Python/blob/master/conversions/length_conversion.py) * [Molecular Chemistry](https://github.com/TheAlgorithms/Python/blob/master/conversions/molecular_chemistry.py) * [Octal To Decimal](https://github.com/TheAlgorithms/Python/blob/master/conversions/octal_to_decimal.py) * [Prefix Conversions](https://github.com/TheAlgorithms/Python/blob/master/conversions/prefix_conversions.py) * [Rgb Hsv Conversion](https://github.com/TheAlgorithms/Python/blob/master/conversions/rgb_hsv_conversion.py) * [Roman Numerals](https://github.com/TheAlgorithms/Python/blob/master/conversions/roman_numerals.py) * [Temperature Conversions](https://github.com/TheAlgorithms/Python/blob/master/conversions/temperature_conversions.py) * [Volume Conversions](https://github.com/TheAlgorithms/Python/blob/master/conversions/volume_conversions.py) * [Weight Conversion](https://github.com/TheAlgorithms/Python/blob/master/conversions/weight_conversion.py) ## Data Structures * Binary Tree * [Avl Tree](https://github.com/TheAlgorithms/Python/blob/master/data_structures/binary_tree/avl_tree.py) * [Basic Binary Tree](https://github.com/TheAlgorithms/Python/blob/master/data_structures/binary_tree/basic_binary_tree.py) * [Binary Search Tree](https://github.com/TheAlgorithms/Python/blob/master/data_structures/binary_tree/binary_search_tree.py) * [Binary Search Tree Recursive](https://github.com/TheAlgorithms/Python/blob/master/data_structures/binary_tree/binary_search_tree_recursive.py) * [Binary Tree Mirror](https://github.com/TheAlgorithms/Python/blob/master/data_structures/binary_tree/binary_tree_mirror.py) * [Binary Tree Traversals](https://github.com/TheAlgorithms/Python/blob/master/data_structures/binary_tree/binary_tree_traversals.py) * [Fenwick Tree](https://github.com/TheAlgorithms/Python/blob/master/data_structures/binary_tree/fenwick_tree.py) * [Lazy Segment Tree](https://github.com/TheAlgorithms/Python/blob/master/data_structures/binary_tree/lazy_segment_tree.py) * [Lowest Common Ancestor](https://github.com/TheAlgorithms/Python/blob/master/data_structures/binary_tree/lowest_common_ancestor.py) * [Merge Two Binary Trees](https://github.com/TheAlgorithms/Python/blob/master/data_structures/binary_tree/merge_two_binary_trees.py) * [Non Recursive Segment Tree](https://github.com/TheAlgorithms/Python/blob/master/data_structures/binary_tree/non_recursive_segment_tree.py) * [Number Of Possible Binary Trees](https://github.com/TheAlgorithms/Python/blob/master/data_structures/binary_tree/number_of_possible_binary_trees.py) * [Red Black Tree](https://github.com/TheAlgorithms/Python/blob/master/data_structures/binary_tree/red_black_tree.py) * [Segment Tree](https://github.com/TheAlgorithms/Python/blob/master/data_structures/binary_tree/segment_tree.py) * [Segment Tree Other](https://github.com/TheAlgorithms/Python/blob/master/data_structures/binary_tree/segment_tree_other.py) * 
[Treap](https://github.com/TheAlgorithms/Python/blob/master/data_structures/binary_tree/treap.py) * [Wavelet Tree](https://github.com/TheAlgorithms/Python/blob/master/data_structures/binary_tree/wavelet_tree.py) * Disjoint Set * [Alternate Disjoint Set](https://github.com/TheAlgorithms/Python/blob/master/data_structures/disjoint_set/alternate_disjoint_set.py) * [Disjoint Set](https://github.com/TheAlgorithms/Python/blob/master/data_structures/disjoint_set/disjoint_set.py) * Hashing * [Double Hash](https://github.com/TheAlgorithms/Python/blob/master/data_structures/hashing/double_hash.py) * [Hash Table](https://github.com/TheAlgorithms/Python/blob/master/data_structures/hashing/hash_table.py) * [Hash Table With Linked List](https://github.com/TheAlgorithms/Python/blob/master/data_structures/hashing/hash_table_with_linked_list.py) * Number Theory * [Prime Numbers](https://github.com/TheAlgorithms/Python/blob/master/data_structures/hashing/number_theory/prime_numbers.py) * [Quadratic Probing](https://github.com/TheAlgorithms/Python/blob/master/data_structures/hashing/quadratic_probing.py) * Heap * [Binomial Heap](https://github.com/TheAlgorithms/Python/blob/master/data_structures/heap/binomial_heap.py) * [Heap](https://github.com/TheAlgorithms/Python/blob/master/data_structures/heap/heap.py) * [Heap Generic](https://github.com/TheAlgorithms/Python/blob/master/data_structures/heap/heap_generic.py) * [Max Heap](https://github.com/TheAlgorithms/Python/blob/master/data_structures/heap/max_heap.py) * [Min Heap](https://github.com/TheAlgorithms/Python/blob/master/data_structures/heap/min_heap.py) * [Randomized Heap](https://github.com/TheAlgorithms/Python/blob/master/data_structures/heap/randomized_heap.py) * [Skew Heap](https://github.com/TheAlgorithms/Python/blob/master/data_structures/heap/skew_heap.py) * Linked List * [Circular Linked List](https://github.com/TheAlgorithms/Python/blob/master/data_structures/linked_list/circular_linked_list.py) * [Deque Doubly](https://github.com/TheAlgorithms/Python/blob/master/data_structures/linked_list/deque_doubly.py) * [Doubly Linked List](https://github.com/TheAlgorithms/Python/blob/master/data_structures/linked_list/doubly_linked_list.py) * [Doubly Linked List Two](https://github.com/TheAlgorithms/Python/blob/master/data_structures/linked_list/doubly_linked_list_two.py) * [From Sequence](https://github.com/TheAlgorithms/Python/blob/master/data_structures/linked_list/from_sequence.py) * [Has Loop](https://github.com/TheAlgorithms/Python/blob/master/data_structures/linked_list/has_loop.py) * [Is Palindrome](https://github.com/TheAlgorithms/Python/blob/master/data_structures/linked_list/is_palindrome.py) * [Merge Two Lists](https://github.com/TheAlgorithms/Python/blob/master/data_structures/linked_list/merge_two_lists.py) * [Middle Element Of Linked List](https://github.com/TheAlgorithms/Python/blob/master/data_structures/linked_list/middle_element_of_linked_list.py) * [Print Reverse](https://github.com/TheAlgorithms/Python/blob/master/data_structures/linked_list/print_reverse.py) * [Singly Linked List](https://github.com/TheAlgorithms/Python/blob/master/data_structures/linked_list/singly_linked_list.py) * [Skip List](https://github.com/TheAlgorithms/Python/blob/master/data_structures/linked_list/skip_list.py) * [Swap Nodes](https://github.com/TheAlgorithms/Python/blob/master/data_structures/linked_list/swap_nodes.py) * Queue * [Circular Queue](https://github.com/TheAlgorithms/Python/blob/master/data_structures/queue/circular_queue.py) * [Double Ended 
Queue](https://github.com/TheAlgorithms/Python/blob/master/data_structures/queue/double_ended_queue.py) * [Linked Queue](https://github.com/TheAlgorithms/Python/blob/master/data_structures/queue/linked_queue.py) * [Priority Queue Using List](https://github.com/TheAlgorithms/Python/blob/master/data_structures/queue/priority_queue_using_list.py) * [Queue On List](https://github.com/TheAlgorithms/Python/blob/master/data_structures/queue/queue_on_list.py) * [Queue On Pseudo Stack](https://github.com/TheAlgorithms/Python/blob/master/data_structures/queue/queue_on_pseudo_stack.py) * Stacks * [Balanced Parentheses](https://github.com/TheAlgorithms/Python/blob/master/data_structures/stacks/balanced_parentheses.py) * [Dijkstras Two Stack Algorithm](https://github.com/TheAlgorithms/Python/blob/master/data_structures/stacks/dijkstras_two_stack_algorithm.py) * [Evaluate Postfix Notations](https://github.com/TheAlgorithms/Python/blob/master/data_structures/stacks/evaluate_postfix_notations.py) * [Infix To Postfix Conversion](https://github.com/TheAlgorithms/Python/blob/master/data_structures/stacks/infix_to_postfix_conversion.py) * [Infix To Prefix Conversion](https://github.com/TheAlgorithms/Python/blob/master/data_structures/stacks/infix_to_prefix_conversion.py) * [Linked Stack](https://github.com/TheAlgorithms/Python/blob/master/data_structures/stacks/linked_stack.py) * [Next Greater Element](https://github.com/TheAlgorithms/Python/blob/master/data_structures/stacks/next_greater_element.py) * [Postfix Evaluation](https://github.com/TheAlgorithms/Python/blob/master/data_structures/stacks/postfix_evaluation.py) * [Prefix Evaluation](https://github.com/TheAlgorithms/Python/blob/master/data_structures/stacks/prefix_evaluation.py) * [Stack](https://github.com/TheAlgorithms/Python/blob/master/data_structures/stacks/stack.py) * [Stack Using Dll](https://github.com/TheAlgorithms/Python/blob/master/data_structures/stacks/stack_using_dll.py) * [Stock Span Problem](https://github.com/TheAlgorithms/Python/blob/master/data_structures/stacks/stock_span_problem.py) * Trie * [Trie](https://github.com/TheAlgorithms/Python/blob/master/data_structures/trie/trie.py) ## Digital Image Processing * [Change Brightness](https://github.com/TheAlgorithms/Python/blob/master/digital_image_processing/change_brightness.py) * [Change Contrast](https://github.com/TheAlgorithms/Python/blob/master/digital_image_processing/change_contrast.py) * [Convert To Negative](https://github.com/TheAlgorithms/Python/blob/master/digital_image_processing/convert_to_negative.py) * Dithering * [Burkes](https://github.com/TheAlgorithms/Python/blob/master/digital_image_processing/dithering/burkes.py) * Edge Detection * [Canny](https://github.com/TheAlgorithms/Python/blob/master/digital_image_processing/edge_detection/canny.py) * Filters * [Bilateral Filter](https://github.com/TheAlgorithms/Python/blob/master/digital_image_processing/filters/bilateral_filter.py) * [Convolve](https://github.com/TheAlgorithms/Python/blob/master/digital_image_processing/filters/convolve.py) * [Gaussian Filter](https://github.com/TheAlgorithms/Python/blob/master/digital_image_processing/filters/gaussian_filter.py) * [Median Filter](https://github.com/TheAlgorithms/Python/blob/master/digital_image_processing/filters/median_filter.py) * [Sobel Filter](https://github.com/TheAlgorithms/Python/blob/master/digital_image_processing/filters/sobel_filter.py) * Histogram Equalization * [Histogram 
Stretch](https://github.com/TheAlgorithms/Python/blob/master/digital_image_processing/histogram_equalization/histogram_stretch.py) * [Index Calculation](https://github.com/TheAlgorithms/Python/blob/master/digital_image_processing/index_calculation.py) * Morphological Operations * [Dilation Operation](https://github.com/TheAlgorithms/Python/blob/master/digital_image_processing/morphological_operations/dilation_operation.py) * [Erosion Operation](https://github.com/TheAlgorithms/Python/blob/master/digital_image_processing/morphological_operations/erosion_operation.py) * Resize * [Resize](https://github.com/TheAlgorithms/Python/blob/master/digital_image_processing/resize/resize.py) * Rotation * [Rotation](https://github.com/TheAlgorithms/Python/blob/master/digital_image_processing/rotation/rotation.py) * [Sepia](https://github.com/TheAlgorithms/Python/blob/master/digital_image_processing/sepia.py) * [Test Digital Image Processing](https://github.com/TheAlgorithms/Python/blob/master/digital_image_processing/test_digital_image_processing.py) ## Divide And Conquer * [Closest Pair Of Points](https://github.com/TheAlgorithms/Python/blob/master/divide_and_conquer/closest_pair_of_points.py) * [Convex Hull](https://github.com/TheAlgorithms/Python/blob/master/divide_and_conquer/convex_hull.py) * [Heaps Algorithm](https://github.com/TheAlgorithms/Python/blob/master/divide_and_conquer/heaps_algorithm.py) * [Heaps Algorithm Iterative](https://github.com/TheAlgorithms/Python/blob/master/divide_and_conquer/heaps_algorithm_iterative.py) * [Inversions](https://github.com/TheAlgorithms/Python/blob/master/divide_and_conquer/inversions.py) * [Kth Order Statistic](https://github.com/TheAlgorithms/Python/blob/master/divide_and_conquer/kth_order_statistic.py) * [Max Difference Pair](https://github.com/TheAlgorithms/Python/blob/master/divide_and_conquer/max_difference_pair.py) * [Max Subarray Sum](https://github.com/TheAlgorithms/Python/blob/master/divide_and_conquer/max_subarray_sum.py) * [Mergesort](https://github.com/TheAlgorithms/Python/blob/master/divide_and_conquer/mergesort.py) * [Peak](https://github.com/TheAlgorithms/Python/blob/master/divide_and_conquer/peak.py) * [Power](https://github.com/TheAlgorithms/Python/blob/master/divide_and_conquer/power.py) * [Strassen Matrix Multiplication](https://github.com/TheAlgorithms/Python/blob/master/divide_and_conquer/strassen_matrix_multiplication.py) ## Dynamic Programming * [Abbreviation](https://github.com/TheAlgorithms/Python/blob/master/dynamic_programming/abbreviation.py) * [Bitmask](https://github.com/TheAlgorithms/Python/blob/master/dynamic_programming/bitmask.py) * [Catalan Numbers](https://github.com/TheAlgorithms/Python/blob/master/dynamic_programming/catalan_numbers.py) * [Climbing Stairs](https://github.com/TheAlgorithms/Python/blob/master/dynamic_programming/climbing_stairs.py) * [Edit Distance](https://github.com/TheAlgorithms/Python/blob/master/dynamic_programming/edit_distance.py) * [Factorial](https://github.com/TheAlgorithms/Python/blob/master/dynamic_programming/factorial.py) * [Fast Fibonacci](https://github.com/TheAlgorithms/Python/blob/master/dynamic_programming/fast_fibonacci.py) * [Fibonacci](https://github.com/TheAlgorithms/Python/blob/master/dynamic_programming/fibonacci.py) * [Floyd Warshall](https://github.com/TheAlgorithms/Python/blob/master/dynamic_programming/floyd_warshall.py) * [Fractional Knapsack](https://github.com/TheAlgorithms/Python/blob/master/dynamic_programming/fractional_knapsack.py) * [Fractional Knapsack 
2](https://github.com/TheAlgorithms/Python/blob/master/dynamic_programming/fractional_knapsack_2.py) * [Integer Partition](https://github.com/TheAlgorithms/Python/blob/master/dynamic_programming/integer_partition.py) * [Iterating Through Submasks](https://github.com/TheAlgorithms/Python/blob/master/dynamic_programming/iterating_through_submasks.py) * [Knapsack](https://github.com/TheAlgorithms/Python/blob/master/dynamic_programming/knapsack.py) * [Longest Common Subsequence](https://github.com/TheAlgorithms/Python/blob/master/dynamic_programming/longest_common_subsequence.py) * [Longest Increasing Subsequence](https://github.com/TheAlgorithms/Python/blob/master/dynamic_programming/longest_increasing_subsequence.py) * [Longest Increasing Subsequence O(Nlogn)](https://github.com/TheAlgorithms/Python/blob/master/dynamic_programming/longest_increasing_subsequence_o(nlogn).py) * [Longest Sub Array](https://github.com/TheAlgorithms/Python/blob/master/dynamic_programming/longest_sub_array.py) * [Matrix Chain Order](https://github.com/TheAlgorithms/Python/blob/master/dynamic_programming/matrix_chain_order.py) * [Max Non Adjacent Sum](https://github.com/TheAlgorithms/Python/blob/master/dynamic_programming/max_non_adjacent_sum.py) * [Max Sub Array](https://github.com/TheAlgorithms/Python/blob/master/dynamic_programming/max_sub_array.py) * [Max Sum Contiguous Subsequence](https://github.com/TheAlgorithms/Python/blob/master/dynamic_programming/max_sum_contiguous_subsequence.py) * [Minimum Coin Change](https://github.com/TheAlgorithms/Python/blob/master/dynamic_programming/minimum_coin_change.py) * [Minimum Cost Path](https://github.com/TheAlgorithms/Python/blob/master/dynamic_programming/minimum_cost_path.py) * [Minimum Partition](https://github.com/TheAlgorithms/Python/blob/master/dynamic_programming/minimum_partition.py) * [Minimum Steps To One](https://github.com/TheAlgorithms/Python/blob/master/dynamic_programming/minimum_steps_to_one.py) * [Optimal Binary Search Tree](https://github.com/TheAlgorithms/Python/blob/master/dynamic_programming/optimal_binary_search_tree.py) * [Rod Cutting](https://github.com/TheAlgorithms/Python/blob/master/dynamic_programming/rod_cutting.py) * [Subset Generation](https://github.com/TheAlgorithms/Python/blob/master/dynamic_programming/subset_generation.py) * [Sum Of Subset](https://github.com/TheAlgorithms/Python/blob/master/dynamic_programming/sum_of_subset.py) ## Electronics * [Carrier Concentration](https://github.com/TheAlgorithms/Python/blob/master/electronics/carrier_concentration.py) * [Coulombs Law](https://github.com/TheAlgorithms/Python/blob/master/electronics/coulombs_law.py) * [Electric Power](https://github.com/TheAlgorithms/Python/blob/master/electronics/electric_power.py) * [Ohms Law](https://github.com/TheAlgorithms/Python/blob/master/electronics/ohms_law.py) ## File Transfer * [Receive File](https://github.com/TheAlgorithms/Python/blob/master/file_transfer/receive_file.py) * [Send File](https://github.com/TheAlgorithms/Python/blob/master/file_transfer/send_file.py) * Tests * [Test Send File](https://github.com/TheAlgorithms/Python/blob/master/file_transfer/tests/test_send_file.py) ## Financial * [Interest](https://github.com/TheAlgorithms/Python/blob/master/financial/interest.py) ## Fractals * [Julia Sets](https://github.com/TheAlgorithms/Python/blob/master/fractals/julia_sets.py) * [Koch Snowflake](https://github.com/TheAlgorithms/Python/blob/master/fractals/koch_snowflake.py) * 
[Mandelbrot](https://github.com/TheAlgorithms/Python/blob/master/fractals/mandelbrot.py) * [Sierpinski Triangle](https://github.com/TheAlgorithms/Python/blob/master/fractals/sierpinski_triangle.py) ## Fuzzy Logic * [Fuzzy Operations](https://github.com/TheAlgorithms/Python/blob/master/fuzzy_logic/fuzzy_operations.py) ## Genetic Algorithm * [Basic String](https://github.com/TheAlgorithms/Python/blob/master/genetic_algorithm/basic_string.py) ## Geodesy * [Haversine Distance](https://github.com/TheAlgorithms/Python/blob/master/geodesy/haversine_distance.py) * [Lamberts Ellipsoidal Distance](https://github.com/TheAlgorithms/Python/blob/master/geodesy/lamberts_ellipsoidal_distance.py) ## Graphics * [Bezier Curve](https://github.com/TheAlgorithms/Python/blob/master/graphics/bezier_curve.py) * [Vector3 For 2D Rendering](https://github.com/TheAlgorithms/Python/blob/master/graphics/vector3_for_2d_rendering.py) ## Graphs * [A Star](https://github.com/TheAlgorithms/Python/blob/master/graphs/a_star.py) * [Articulation Points](https://github.com/TheAlgorithms/Python/blob/master/graphs/articulation_points.py) * [Basic Graphs](https://github.com/TheAlgorithms/Python/blob/master/graphs/basic_graphs.py) * [Bellman Ford](https://github.com/TheAlgorithms/Python/blob/master/graphs/bellman_ford.py) * [Bfs Shortest Path](https://github.com/TheAlgorithms/Python/blob/master/graphs/bfs_shortest_path.py) * [Bfs Zero One Shortest Path](https://github.com/TheAlgorithms/Python/blob/master/graphs/bfs_zero_one_shortest_path.py) * [Bidirectional A Star](https://github.com/TheAlgorithms/Python/blob/master/graphs/bidirectional_a_star.py) * [Bidirectional Breadth First Search](https://github.com/TheAlgorithms/Python/blob/master/graphs/bidirectional_breadth_first_search.py) * [Boruvka](https://github.com/TheAlgorithms/Python/blob/master/graphs/boruvka.py) * [Breadth First Search](https://github.com/TheAlgorithms/Python/blob/master/graphs/breadth_first_search.py) * [Breadth First Search 2](https://github.com/TheAlgorithms/Python/blob/master/graphs/breadth_first_search_2.py) * [Breadth First Search Shortest Path](https://github.com/TheAlgorithms/Python/blob/master/graphs/breadth_first_search_shortest_path.py) * [Check Bipartite Graph Bfs](https://github.com/TheAlgorithms/Python/blob/master/graphs/check_bipartite_graph_bfs.py) * [Check Bipartite Graph Dfs](https://github.com/TheAlgorithms/Python/blob/master/graphs/check_bipartite_graph_dfs.py) * [Check Cycle](https://github.com/TheAlgorithms/Python/blob/master/graphs/check_cycle.py) * [Connected Components](https://github.com/TheAlgorithms/Python/blob/master/graphs/connected_components.py) * [Depth First Search](https://github.com/TheAlgorithms/Python/blob/master/graphs/depth_first_search.py) * [Depth First Search 2](https://github.com/TheAlgorithms/Python/blob/master/graphs/depth_first_search_2.py) * [Dijkstra](https://github.com/TheAlgorithms/Python/blob/master/graphs/dijkstra.py) * [Dijkstra 2](https://github.com/TheAlgorithms/Python/blob/master/graphs/dijkstra_2.py) * [Dijkstra Algorithm](https://github.com/TheAlgorithms/Python/blob/master/graphs/dijkstra_algorithm.py) * [Dinic](https://github.com/TheAlgorithms/Python/blob/master/graphs/dinic.py) * [Directed And Undirected (Weighted) Graph](https://github.com/TheAlgorithms/Python/blob/master/graphs/directed_and_undirected_(weighted)_graph.py) * [Edmonds Karp Multiple Source And Sink](https://github.com/TheAlgorithms/Python/blob/master/graphs/edmonds_karp_multiple_source_and_sink.py) * [Eulerian Path And Circuit For 
Undirected Graph](https://github.com/TheAlgorithms/Python/blob/master/graphs/eulerian_path_and_circuit_for_undirected_graph.py) * [Even Tree](https://github.com/TheAlgorithms/Python/blob/master/graphs/even_tree.py) * [Finding Bridges](https://github.com/TheAlgorithms/Python/blob/master/graphs/finding_bridges.py) * [Frequent Pattern Graph Miner](https://github.com/TheAlgorithms/Python/blob/master/graphs/frequent_pattern_graph_miner.py) * [G Topological Sort](https://github.com/TheAlgorithms/Python/blob/master/graphs/g_topological_sort.py) * [Gale Shapley Bigraph](https://github.com/TheAlgorithms/Python/blob/master/graphs/gale_shapley_bigraph.py) * [Graph List](https://github.com/TheAlgorithms/Python/blob/master/graphs/graph_list.py) * [Graph Matrix](https://github.com/TheAlgorithms/Python/blob/master/graphs/graph_matrix.py) * [Graphs Floyd Warshall](https://github.com/TheAlgorithms/Python/blob/master/graphs/graphs_floyd_warshall.py) * [Greedy Best First](https://github.com/TheAlgorithms/Python/blob/master/graphs/greedy_best_first.py) * [Greedy Min Vertex Cover](https://github.com/TheAlgorithms/Python/blob/master/graphs/greedy_min_vertex_cover.py) * [Kahns Algorithm Long](https://github.com/TheAlgorithms/Python/blob/master/graphs/kahns_algorithm_long.py) * [Kahns Algorithm Topo](https://github.com/TheAlgorithms/Python/blob/master/graphs/kahns_algorithm_topo.py) * [Karger](https://github.com/TheAlgorithms/Python/blob/master/graphs/karger.py) * [Markov Chain](https://github.com/TheAlgorithms/Python/blob/master/graphs/markov_chain.py) * [Matching Min Vertex Cover](https://github.com/TheAlgorithms/Python/blob/master/graphs/matching_min_vertex_cover.py) * [Minimum Spanning Tree Boruvka](https://github.com/TheAlgorithms/Python/blob/master/graphs/minimum_spanning_tree_boruvka.py) * [Minimum Spanning Tree Kruskal](https://github.com/TheAlgorithms/Python/blob/master/graphs/minimum_spanning_tree_kruskal.py) * [Minimum Spanning Tree Kruskal2](https://github.com/TheAlgorithms/Python/blob/master/graphs/minimum_spanning_tree_kruskal2.py) * [Minimum Spanning Tree Prims](https://github.com/TheAlgorithms/Python/blob/master/graphs/minimum_spanning_tree_prims.py) * [Minimum Spanning Tree Prims2](https://github.com/TheAlgorithms/Python/blob/master/graphs/minimum_spanning_tree_prims2.py) * [Multi Heuristic Astar](https://github.com/TheAlgorithms/Python/blob/master/graphs/multi_heuristic_astar.py) * [Page Rank](https://github.com/TheAlgorithms/Python/blob/master/graphs/page_rank.py) * [Prim](https://github.com/TheAlgorithms/Python/blob/master/graphs/prim.py) * [Random Graph Generator](https://github.com/TheAlgorithms/Python/blob/master/graphs/random_graph_generator.py) * [Scc Kosaraju](https://github.com/TheAlgorithms/Python/blob/master/graphs/scc_kosaraju.py) * [Strongly Connected Components](https://github.com/TheAlgorithms/Python/blob/master/graphs/strongly_connected_components.py) * [Tarjans Scc](https://github.com/TheAlgorithms/Python/blob/master/graphs/tarjans_scc.py) * Tests * [Test Min Spanning Tree Kruskal](https://github.com/TheAlgorithms/Python/blob/master/graphs/tests/test_min_spanning_tree_kruskal.py) * [Test Min Spanning Tree Prim](https://github.com/TheAlgorithms/Python/blob/master/graphs/tests/test_min_spanning_tree_prim.py) ## Greedy Methods * [Optimal Merge Pattern](https://github.com/TheAlgorithms/Python/blob/master/greedy_methods/optimal_merge_pattern.py) ## Hashes * [Adler32](https://github.com/TheAlgorithms/Python/blob/master/hashes/adler32.py) * [Chaos 
Machine](https://github.com/TheAlgorithms/Python/blob/master/hashes/chaos_machine.py) * [Djb2](https://github.com/TheAlgorithms/Python/blob/master/hashes/djb2.py) * [Enigma Machine](https://github.com/TheAlgorithms/Python/blob/master/hashes/enigma_machine.py) * [Hamming Code](https://github.com/TheAlgorithms/Python/blob/master/hashes/hamming_code.py) * [Luhn](https://github.com/TheAlgorithms/Python/blob/master/hashes/luhn.py) * [Md5](https://github.com/TheAlgorithms/Python/blob/master/hashes/md5.py) * [Sdbm](https://github.com/TheAlgorithms/Python/blob/master/hashes/sdbm.py) * [Sha1](https://github.com/TheAlgorithms/Python/blob/master/hashes/sha1.py) * [Sha256](https://github.com/TheAlgorithms/Python/blob/master/hashes/sha256.py) ## Knapsack * [Greedy Knapsack](https://github.com/TheAlgorithms/Python/blob/master/knapsack/greedy_knapsack.py) * [Knapsack](https://github.com/TheAlgorithms/Python/blob/master/knapsack/knapsack.py) * Tests * [Test Greedy Knapsack](https://github.com/TheAlgorithms/Python/blob/master/knapsack/tests/test_greedy_knapsack.py) * [Test Knapsack](https://github.com/TheAlgorithms/Python/blob/master/knapsack/tests/test_knapsack.py) ## Linear Algebra * Src * [Conjugate Gradient](https://github.com/TheAlgorithms/Python/blob/master/linear_algebra/src/conjugate_gradient.py) * [Lib](https://github.com/TheAlgorithms/Python/blob/master/linear_algebra/src/lib.py) * [Polynom For Points](https://github.com/TheAlgorithms/Python/blob/master/linear_algebra/src/polynom_for_points.py) * [Power Iteration](https://github.com/TheAlgorithms/Python/blob/master/linear_algebra/src/power_iteration.py) * [Rayleigh Quotient](https://github.com/TheAlgorithms/Python/blob/master/linear_algebra/src/rayleigh_quotient.py) * [Schur Complement](https://github.com/TheAlgorithms/Python/blob/master/linear_algebra/src/schur_complement.py) * [Test Linear Algebra](https://github.com/TheAlgorithms/Python/blob/master/linear_algebra/src/test_linear_algebra.py) * [Transformations 2D](https://github.com/TheAlgorithms/Python/blob/master/linear_algebra/src/transformations_2d.py) ## Machine Learning * [Astar](https://github.com/TheAlgorithms/Python/blob/master/machine_learning/astar.py) * [Data Transformations](https://github.com/TheAlgorithms/Python/blob/master/machine_learning/data_transformations.py) * [Decision Tree](https://github.com/TheAlgorithms/Python/blob/master/machine_learning/decision_tree.py) * Forecasting * [Run](https://github.com/TheAlgorithms/Python/blob/master/machine_learning/forecasting/run.py) * [Gaussian Naive Bayes](https://github.com/TheAlgorithms/Python/blob/master/machine_learning/gaussian_naive_bayes.py) * [Gradient Boosting Regressor](https://github.com/TheAlgorithms/Python/blob/master/machine_learning/gradient_boosting_regressor.py) * [Gradient Descent](https://github.com/TheAlgorithms/Python/blob/master/machine_learning/gradient_descent.py) * [K Means Clust](https://github.com/TheAlgorithms/Python/blob/master/machine_learning/k_means_clust.py) * [K Nearest Neighbours](https://github.com/TheAlgorithms/Python/blob/master/machine_learning/k_nearest_neighbours.py) * [Knn Sklearn](https://github.com/TheAlgorithms/Python/blob/master/machine_learning/knn_sklearn.py) * [Linear Discriminant Analysis](https://github.com/TheAlgorithms/Python/blob/master/machine_learning/linear_discriminant_analysis.py) * [Linear Regression](https://github.com/TheAlgorithms/Python/blob/master/machine_learning/linear_regression.py) * [Logistic 
Regression](https://github.com/TheAlgorithms/Python/blob/master/machine_learning/logistic_regression.py) * Lstm * [Lstm Prediction](https://github.com/TheAlgorithms/Python/blob/master/machine_learning/lstm/lstm_prediction.py) * [Multilayer Perceptron Classifier](https://github.com/TheAlgorithms/Python/blob/master/machine_learning/multilayer_perceptron_classifier.py) * [Polymonial Regression](https://github.com/TheAlgorithms/Python/blob/master/machine_learning/polymonial_regression.py) * [Random Forest Classifier](https://github.com/TheAlgorithms/Python/blob/master/machine_learning/random_forest_classifier.py) * [Random Forest Regressor](https://github.com/TheAlgorithms/Python/blob/master/machine_learning/random_forest_regressor.py) * [Scoring Functions](https://github.com/TheAlgorithms/Python/blob/master/machine_learning/scoring_functions.py) * [Sequential Minimum Optimization](https://github.com/TheAlgorithms/Python/blob/master/machine_learning/sequential_minimum_optimization.py) * [Similarity Search](https://github.com/TheAlgorithms/Python/blob/master/machine_learning/similarity_search.py) * [Support Vector Machines](https://github.com/TheAlgorithms/Python/blob/master/machine_learning/support_vector_machines.py) * [Word Frequency Functions](https://github.com/TheAlgorithms/Python/blob/master/machine_learning/word_frequency_functions.py) ## Maths * [3N Plus 1](https://github.com/TheAlgorithms/Python/blob/master/maths/3n_plus_1.py) * [Abs](https://github.com/TheAlgorithms/Python/blob/master/maths/abs.py) * [Abs Max](https://github.com/TheAlgorithms/Python/blob/master/maths/abs_max.py) * [Abs Min](https://github.com/TheAlgorithms/Python/blob/master/maths/abs_min.py) * [Add](https://github.com/TheAlgorithms/Python/blob/master/maths/add.py) * [Aliquot Sum](https://github.com/TheAlgorithms/Python/blob/master/maths/aliquot_sum.py) * [Allocation Number](https://github.com/TheAlgorithms/Python/blob/master/maths/allocation_number.py) * [Area](https://github.com/TheAlgorithms/Python/blob/master/maths/area.py) * [Area Under Curve](https://github.com/TheAlgorithms/Python/blob/master/maths/area_under_curve.py) * [Armstrong Numbers](https://github.com/TheAlgorithms/Python/blob/master/maths/armstrong_numbers.py) * [Average Mean](https://github.com/TheAlgorithms/Python/blob/master/maths/average_mean.py) * [Average Median](https://github.com/TheAlgorithms/Python/blob/master/maths/average_median.py) * [Average Mode](https://github.com/TheAlgorithms/Python/blob/master/maths/average_mode.py) * [Bailey Borwein Plouffe](https://github.com/TheAlgorithms/Python/blob/master/maths/bailey_borwein_plouffe.py) * [Basic Maths](https://github.com/TheAlgorithms/Python/blob/master/maths/basic_maths.py) * [Binary Exp Mod](https://github.com/TheAlgorithms/Python/blob/master/maths/binary_exp_mod.py) * [Binary Exponentiation](https://github.com/TheAlgorithms/Python/blob/master/maths/binary_exponentiation.py) * [Binary Exponentiation 2](https://github.com/TheAlgorithms/Python/blob/master/maths/binary_exponentiation_2.py) * [Binary Exponentiation 3](https://github.com/TheAlgorithms/Python/blob/master/maths/binary_exponentiation_3.py) * [Binomial Coefficient](https://github.com/TheAlgorithms/Python/blob/master/maths/binomial_coefficient.py) * [Binomial Distribution](https://github.com/TheAlgorithms/Python/blob/master/maths/binomial_distribution.py) * [Bisection](https://github.com/TheAlgorithms/Python/blob/master/maths/bisection.py) * [Ceil](https://github.com/TheAlgorithms/Python/blob/master/maths/ceil.py) * [Check 
Polygon](https://github.com/TheAlgorithms/Python/blob/master/maths/check_polygon.py) * [Chudnovsky Algorithm](https://github.com/TheAlgorithms/Python/blob/master/maths/chudnovsky_algorithm.py) * [Collatz Sequence](https://github.com/TheAlgorithms/Python/blob/master/maths/collatz_sequence.py) * [Combinations](https://github.com/TheAlgorithms/Python/blob/master/maths/combinations.py) * [Decimal Isolate](https://github.com/TheAlgorithms/Python/blob/master/maths/decimal_isolate.py) * [Double Factorial Iterative](https://github.com/TheAlgorithms/Python/blob/master/maths/double_factorial_iterative.py) * [Double Factorial Recursive](https://github.com/TheAlgorithms/Python/blob/master/maths/double_factorial_recursive.py) * [Entropy](https://github.com/TheAlgorithms/Python/blob/master/maths/entropy.py) * [Euclidean Distance](https://github.com/TheAlgorithms/Python/blob/master/maths/euclidean_distance.py) * [Euclidean Gcd](https://github.com/TheAlgorithms/Python/blob/master/maths/euclidean_gcd.py) * [Euler Method](https://github.com/TheAlgorithms/Python/blob/master/maths/euler_method.py) * [Euler Modified](https://github.com/TheAlgorithms/Python/blob/master/maths/euler_modified.py) * [Eulers Totient](https://github.com/TheAlgorithms/Python/blob/master/maths/eulers_totient.py) * [Extended Euclidean Algorithm](https://github.com/TheAlgorithms/Python/blob/master/maths/extended_euclidean_algorithm.py) * [Factorial Iterative](https://github.com/TheAlgorithms/Python/blob/master/maths/factorial_iterative.py) * [Factorial Recursive](https://github.com/TheAlgorithms/Python/blob/master/maths/factorial_recursive.py) * [Factors](https://github.com/TheAlgorithms/Python/blob/master/maths/factors.py) * [Fermat Little Theorem](https://github.com/TheAlgorithms/Python/blob/master/maths/fermat_little_theorem.py) * [Fibonacci](https://github.com/TheAlgorithms/Python/blob/master/maths/fibonacci.py) * [Fibonacci Sequence Recursion](https://github.com/TheAlgorithms/Python/blob/master/maths/fibonacci_sequence_recursion.py) * [Find Max](https://github.com/TheAlgorithms/Python/blob/master/maths/find_max.py) * [Find Max Recursion](https://github.com/TheAlgorithms/Python/blob/master/maths/find_max_recursion.py) * [Find Min](https://github.com/TheAlgorithms/Python/blob/master/maths/find_min.py) * [Find Min Recursion](https://github.com/TheAlgorithms/Python/blob/master/maths/find_min_recursion.py) * [Floor](https://github.com/TheAlgorithms/Python/blob/master/maths/floor.py) * [Gamma](https://github.com/TheAlgorithms/Python/blob/master/maths/gamma.py) * [Gamma Recursive](https://github.com/TheAlgorithms/Python/blob/master/maths/gamma_recursive.py) * [Gaussian](https://github.com/TheAlgorithms/Python/blob/master/maths/gaussian.py) * [Greatest Common Divisor](https://github.com/TheAlgorithms/Python/blob/master/maths/greatest_common_divisor.py) * [Greedy Coin Change](https://github.com/TheAlgorithms/Python/blob/master/maths/greedy_coin_change.py) * [Hardy Ramanujanalgo](https://github.com/TheAlgorithms/Python/blob/master/maths/hardy_ramanujanalgo.py) * [Integration By Simpson Approx](https://github.com/TheAlgorithms/Python/blob/master/maths/integration_by_simpson_approx.py) * [Is Ip V4 Address Valid](https://github.com/TheAlgorithms/Python/blob/master/maths/is_ip_v4_address_valid.py) * [Is Square Free](https://github.com/TheAlgorithms/Python/blob/master/maths/is_square_free.py) * [Jaccard Similarity](https://github.com/TheAlgorithms/Python/blob/master/maths/jaccard_similarity.py) * 
[Kadanes](https://github.com/TheAlgorithms/Python/blob/master/maths/kadanes.py) * [Karatsuba](https://github.com/TheAlgorithms/Python/blob/master/maths/karatsuba.py) * [Krishnamurthy Number](https://github.com/TheAlgorithms/Python/blob/master/maths/krishnamurthy_number.py) * [Kth Lexicographic Permutation](https://github.com/TheAlgorithms/Python/blob/master/maths/kth_lexicographic_permutation.py) * [Largest Of Very Large Numbers](https://github.com/TheAlgorithms/Python/blob/master/maths/largest_of_very_large_numbers.py) * [Largest Subarray Sum](https://github.com/TheAlgorithms/Python/blob/master/maths/largest_subarray_sum.py) * [Least Common Multiple](https://github.com/TheAlgorithms/Python/blob/master/maths/least_common_multiple.py) * [Line Length](https://github.com/TheAlgorithms/Python/blob/master/maths/line_length.py) * [Lucas Lehmer Primality Test](https://github.com/TheAlgorithms/Python/blob/master/maths/lucas_lehmer_primality_test.py) * [Lucas Series](https://github.com/TheAlgorithms/Python/blob/master/maths/lucas_series.py) * [Matrix Exponentiation](https://github.com/TheAlgorithms/Python/blob/master/maths/matrix_exponentiation.py) * [Max Sum Sliding Window](https://github.com/TheAlgorithms/Python/blob/master/maths/max_sum_sliding_window.py) * [Median Of Two Arrays](https://github.com/TheAlgorithms/Python/blob/master/maths/median_of_two_arrays.py) * [Miller Rabin](https://github.com/TheAlgorithms/Python/blob/master/maths/miller_rabin.py) * [Mobius Function](https://github.com/TheAlgorithms/Python/blob/master/maths/mobius_function.py) * [Modular Exponential](https://github.com/TheAlgorithms/Python/blob/master/maths/modular_exponential.py) * [Monte Carlo](https://github.com/TheAlgorithms/Python/blob/master/maths/monte_carlo.py) * [Monte Carlo Dice](https://github.com/TheAlgorithms/Python/blob/master/maths/monte_carlo_dice.py) * [Newton Raphson](https://github.com/TheAlgorithms/Python/blob/master/maths/newton_raphson.py) * [Number Of Digits](https://github.com/TheAlgorithms/Python/blob/master/maths/number_of_digits.py) * [Numerical Integration](https://github.com/TheAlgorithms/Python/blob/master/maths/numerical_integration.py) * [Perfect Cube](https://github.com/TheAlgorithms/Python/blob/master/maths/perfect_cube.py) * [Perfect Number](https://github.com/TheAlgorithms/Python/blob/master/maths/perfect_number.py) * [Perfect Square](https://github.com/TheAlgorithms/Python/blob/master/maths/perfect_square.py) * [Pi Monte Carlo Estimation](https://github.com/TheAlgorithms/Python/blob/master/maths/pi_monte_carlo_estimation.py) * [Polynomial Evaluation](https://github.com/TheAlgorithms/Python/blob/master/maths/polynomial_evaluation.py) * [Power Using Recursion](https://github.com/TheAlgorithms/Python/blob/master/maths/power_using_recursion.py) * [Prime Check](https://github.com/TheAlgorithms/Python/blob/master/maths/prime_check.py) * [Prime Factors](https://github.com/TheAlgorithms/Python/blob/master/maths/prime_factors.py) * [Prime Numbers](https://github.com/TheAlgorithms/Python/blob/master/maths/prime_numbers.py) * [Prime Sieve Eratosthenes](https://github.com/TheAlgorithms/Python/blob/master/maths/prime_sieve_eratosthenes.py) * [Primelib](https://github.com/TheAlgorithms/Python/blob/master/maths/primelib.py) * [Proth Number](https://github.com/TheAlgorithms/Python/blob/master/maths/proth_number.py) * [Pythagoras](https://github.com/TheAlgorithms/Python/blob/master/maths/pythagoras.py) * [Qr Decomposition](https://github.com/TheAlgorithms/Python/blob/master/maths/qr_decomposition.py) * 
[Quadratic Equations Complex Numbers](https://github.com/TheAlgorithms/Python/blob/master/maths/quadratic_equations_complex_numbers.py) * [Radians](https://github.com/TheAlgorithms/Python/blob/master/maths/radians.py) * [Radix2 Fft](https://github.com/TheAlgorithms/Python/blob/master/maths/radix2_fft.py) * [Relu](https://github.com/TheAlgorithms/Python/blob/master/maths/relu.py) * [Runge Kutta](https://github.com/TheAlgorithms/Python/blob/master/maths/runge_kutta.py) * [Segmented Sieve](https://github.com/TheAlgorithms/Python/blob/master/maths/segmented_sieve.py) * Series * [Arithmetic](https://github.com/TheAlgorithms/Python/blob/master/maths/series/arithmetic.py) * [Geometric](https://github.com/TheAlgorithms/Python/blob/master/maths/series/geometric.py) * [Geometric Series](https://github.com/TheAlgorithms/Python/blob/master/maths/series/geometric_series.py) * [Harmonic](https://github.com/TheAlgorithms/Python/blob/master/maths/series/harmonic.py) * [Harmonic Series](https://github.com/TheAlgorithms/Python/blob/master/maths/series/harmonic_series.py) * [P Series](https://github.com/TheAlgorithms/Python/blob/master/maths/series/p_series.py) * [Sieve Of Eratosthenes](https://github.com/TheAlgorithms/Python/blob/master/maths/sieve_of_eratosthenes.py) * [Sigmoid](https://github.com/TheAlgorithms/Python/blob/master/maths/sigmoid.py) * [Simpson Rule](https://github.com/TheAlgorithms/Python/blob/master/maths/simpson_rule.py) * [Softmax](https://github.com/TheAlgorithms/Python/blob/master/maths/softmax.py) * [Square Root](https://github.com/TheAlgorithms/Python/blob/master/maths/square_root.py) * [Sum Of Arithmetic Series](https://github.com/TheAlgorithms/Python/blob/master/maths/sum_of_arithmetic_series.py) * [Sum Of Digits](https://github.com/TheAlgorithms/Python/blob/master/maths/sum_of_digits.py) * [Sum Of Geometric Progression](https://github.com/TheAlgorithms/Python/blob/master/maths/sum_of_geometric_progression.py) * [Sylvester Sequence](https://github.com/TheAlgorithms/Python/blob/master/maths/sylvester_sequence.py) * [Test Prime Check](https://github.com/TheAlgorithms/Python/blob/master/maths/test_prime_check.py) * [Trapezoidal Rule](https://github.com/TheAlgorithms/Python/blob/master/maths/trapezoidal_rule.py) * [Triplet Sum](https://github.com/TheAlgorithms/Python/blob/master/maths/triplet_sum.py) * [Two Pointer](https://github.com/TheAlgorithms/Python/blob/master/maths/two_pointer.py) * [Two Sum](https://github.com/TheAlgorithms/Python/blob/master/maths/two_sum.py) * [Ugly Numbers](https://github.com/TheAlgorithms/Python/blob/master/maths/ugly_numbers.py) * [Volume](https://github.com/TheAlgorithms/Python/blob/master/maths/volume.py) * [Zellers Congruence](https://github.com/TheAlgorithms/Python/blob/master/maths/zellers_congruence.py) ## Matrix * [Count Islands In Matrix](https://github.com/TheAlgorithms/Python/blob/master/matrix/count_islands_in_matrix.py) * [Inverse Of Matrix](https://github.com/TheAlgorithms/Python/blob/master/matrix/inverse_of_matrix.py) * [Matrix Class](https://github.com/TheAlgorithms/Python/blob/master/matrix/matrix_class.py) * [Matrix Operation](https://github.com/TheAlgorithms/Python/blob/master/matrix/matrix_operation.py) * [Nth Fibonacci Using Matrix Exponentiation](https://github.com/TheAlgorithms/Python/blob/master/matrix/nth_fibonacci_using_matrix_exponentiation.py) * [Rotate Matrix](https://github.com/TheAlgorithms/Python/blob/master/matrix/rotate_matrix.py) * [Searching In Sorted 
Matrix](https://github.com/TheAlgorithms/Python/blob/master/matrix/searching_in_sorted_matrix.py) * [Sherman Morrison](https://github.com/TheAlgorithms/Python/blob/master/matrix/sherman_morrison.py) * [Spiral Print](https://github.com/TheAlgorithms/Python/blob/master/matrix/spiral_print.py) * Tests * [Test Matrix Operation](https://github.com/TheAlgorithms/Python/blob/master/matrix/tests/test_matrix_operation.py) ## Networking Flow * [Ford Fulkerson](https://github.com/TheAlgorithms/Python/blob/master/networking_flow/ford_fulkerson.py) * [Minimum Cut](https://github.com/TheAlgorithms/Python/blob/master/networking_flow/minimum_cut.py) ## Neural Network * [2 Hidden Layers Neural Network](https://github.com/TheAlgorithms/Python/blob/master/neural_network/2_hidden_layers_neural_network.py) * [Back Propagation Neural Network](https://github.com/TheAlgorithms/Python/blob/master/neural_network/back_propagation_neural_network.py) * [Convolution Neural Network](https://github.com/TheAlgorithms/Python/blob/master/neural_network/convolution_neural_network.py) * [Perceptron](https://github.com/TheAlgorithms/Python/blob/master/neural_network/perceptron.py) ## Other * [Activity Selection](https://github.com/TheAlgorithms/Python/blob/master/other/activity_selection.py) * [Check Strong Password](https://github.com/TheAlgorithms/Python/blob/master/other/check_strong_password.py) * [Date To Weekday](https://github.com/TheAlgorithms/Python/blob/master/other/date_to_weekday.py) * [Davisb Putnamb Logemannb Loveland](https://github.com/TheAlgorithms/Python/blob/master/other/davisb_putnamb_logemannb_loveland.py) * [Dijkstra Bankers Algorithm](https://github.com/TheAlgorithms/Python/blob/master/other/dijkstra_bankers_algorithm.py) * [Doomsday](https://github.com/TheAlgorithms/Python/blob/master/other/doomsday.py) * [Fischer Yates Shuffle](https://github.com/TheAlgorithms/Python/blob/master/other/fischer_yates_shuffle.py) * [Gauss Easter](https://github.com/TheAlgorithms/Python/blob/master/other/gauss_easter.py) * [Graham Scan](https://github.com/TheAlgorithms/Python/blob/master/other/graham_scan.py) * [Greedy](https://github.com/TheAlgorithms/Python/blob/master/other/greedy.py) * [Least Recently Used](https://github.com/TheAlgorithms/Python/blob/master/other/least_recently_used.py) * [Lfu Cache](https://github.com/TheAlgorithms/Python/blob/master/other/lfu_cache.py) * [Linear Congruential Generator](https://github.com/TheAlgorithms/Python/blob/master/other/linear_congruential_generator.py) * [Lru Cache](https://github.com/TheAlgorithms/Python/blob/master/other/lru_cache.py) * [Magicdiamondpattern](https://github.com/TheAlgorithms/Python/blob/master/other/magicdiamondpattern.py) * [Nested Brackets](https://github.com/TheAlgorithms/Python/blob/master/other/nested_brackets.py) * [Password Generator](https://github.com/TheAlgorithms/Python/blob/master/other/password_generator.py) * [Scoring Algorithm](https://github.com/TheAlgorithms/Python/blob/master/other/scoring_algorithm.py) * [Sdes](https://github.com/TheAlgorithms/Python/blob/master/other/sdes.py) * [Tower Of Hanoi](https://github.com/TheAlgorithms/Python/blob/master/other/tower_of_hanoi.py) ## Physics * [N Body Simulation](https://github.com/TheAlgorithms/Python/blob/master/physics/n_body_simulation.py) ## Project Euler * Problem 001 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_001/sol1.py) * [Sol2](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_001/sol2.py) * 
    * [Sol3](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_001/sol3.py)
    * [Sol4](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_001/sol4.py)
    * [Sol5](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_001/sol5.py)
    * [Sol6](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_001/sol6.py)
    * [Sol7](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_001/sol7.py)
  * Problem 002
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_002/sol1.py)
    * [Sol2](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_002/sol2.py)
    * [Sol3](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_002/sol3.py)
    * [Sol4](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_002/sol4.py)
    * [Sol5](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_002/sol5.py)
  * Problem 003
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_003/sol1.py)
    * [Sol2](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_003/sol2.py)
    * [Sol3](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_003/sol3.py)
  * Problem 004
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_004/sol1.py)
    * [Sol2](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_004/sol2.py)
  * Problem 005
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_005/sol1.py)
    * [Sol2](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_005/sol2.py)
  * Problem 006
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_006/sol1.py)
    * [Sol2](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_006/sol2.py)
    * [Sol3](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_006/sol3.py)
    * [Sol4](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_006/sol4.py)
  * Problem 007
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_007/sol1.py)
    * [Sol2](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_007/sol2.py)
    * [Sol3](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_007/sol3.py)
  * Problem 008
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_008/sol1.py)
    * [Sol2](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_008/sol2.py)
    * [Sol3](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_008/sol3.py)
  * Problem 009
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_009/sol1.py)
    * [Sol2](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_009/sol2.py)
    * [Sol3](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_009/sol3.py)
  * Problem 010
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_010/sol1.py)
    * [Sol2](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_010/sol2.py)
    * [Sol3](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_010/sol3.py)
  * Problem 011
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_011/sol1.py)
    * [Sol2](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_011/sol2.py)
  * Problem 012
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_012/sol1.py)
    * [Sol2](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_012/sol2.py)
  * Problem 013
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_013/sol1.py)
  * Problem 014
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_014/sol1.py)
    * [Sol2](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_014/sol2.py)
  * Problem 015
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_015/sol1.py)
  * Problem 016
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_016/sol1.py)
    * [Sol2](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_016/sol2.py)
  * Problem 017
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_017/sol1.py)
  * Problem 018
    * [Solution](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_018/solution.py)
  * Problem 019
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_019/sol1.py)
  * Problem 020
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_020/sol1.py)
    * [Sol2](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_020/sol2.py)
    * [Sol3](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_020/sol3.py)
    * [Sol4](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_020/sol4.py)
  * Problem 021
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_021/sol1.py)
  * Problem 022
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_022/sol1.py)
    * [Sol2](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_022/sol2.py)
  * Problem 023
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_023/sol1.py)
  * Problem 024
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_024/sol1.py)
  * Problem 025
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_025/sol1.py)
    * [Sol2](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_025/sol2.py)
    * [Sol3](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_025/sol3.py)
  * Problem 026
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_026/sol1.py)
  * Problem 027
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_027/sol1.py)
  * Problem 028
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_028/sol1.py)
  * Problem 029
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_029/sol1.py)
  * Problem 030
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_030/sol1.py)
  * Problem 031
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_031/sol1.py)
    * [Sol2](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_031/sol2.py)
  * Problem 032
    * [Sol32](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_032/sol32.py)
  * Problem 033
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_033/sol1.py)
  * Problem 034
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_034/sol1.py)
  * Problem 035
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_035/sol1.py)
  * Problem 036
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_036/sol1.py)
  * Problem 037
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_037/sol1.py)
  * Problem 038
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_038/sol1.py)
  * Problem 039
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_039/sol1.py)
  * Problem 040
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_040/sol1.py)
  * Problem 041
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_041/sol1.py)
  * Problem 042
    * [Solution42](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_042/solution42.py)
  * Problem 043
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_043/sol1.py)
  * Problem 044
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_044/sol1.py)
  * Problem 045
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_045/sol1.py)
  * Problem 046
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_046/sol1.py)
  * Problem 047
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_047/sol1.py)
  * Problem 048
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_048/sol1.py)
  * Problem 049
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_049/sol1.py)
  * Problem 050
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_050/sol1.py)
  * Problem 051
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_051/sol1.py)
  * Problem 052
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_052/sol1.py)
  * Problem 053
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_053/sol1.py)
  * Problem 054
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_054/sol1.py)
    * [Test Poker Hand](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_054/test_poker_hand.py)
  * Problem 055
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_055/sol1.py)
  * Problem 056
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_056/sol1.py)
  * Problem 057
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_057/sol1.py)
  * Problem 058
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_058/sol1.py)
  * Problem 059
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_059/sol1.py)
  * Problem 062
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_062/sol1.py)
  * Problem 063
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_063/sol1.py)
  * Problem 064
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_064/sol1.py)
  * Problem 065
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_065/sol1.py)
  * Problem 067
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_067/sol1.py)
  * Problem 069
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_069/sol1.py)
  * Problem 070
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_070/sol1.py)
  * Problem 071
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_071/sol1.py)
  * Problem 072
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_072/sol1.py)
    * [Sol2](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_072/sol2.py)
  * Problem 074
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_074/sol1.py)
    * [Sol2](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_074/sol2.py)
  * Problem 075
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_075/sol1.py)
  * Problem 076
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_076/sol1.py)
  * Problem 077
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_077/sol1.py)
  * Problem 078
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_078/sol1.py)
  * Problem 080
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_080/sol1.py)
  * Problem 081
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_081/sol1.py)
  * Problem 085
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_085/sol1.py)
  * Problem 086
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_086/sol1.py)
  * Problem 087
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_087/sol1.py)
  * Problem 089
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_089/sol1.py)
  * Problem 091
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_091/sol1.py)
  * Problem 092
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_092/sol1.py)
  * Problem 097
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_097/sol1.py)
  * Problem 099
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_099/sol1.py)
  * Problem 101
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_101/sol1.py)
  * Problem 102
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_102/sol1.py)
  * Problem 107
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_107/sol1.py)
  * Problem 109
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_109/sol1.py)
  * Problem 112
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_112/sol1.py)
  * Problem 113
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_113/sol1.py)
  * Problem 119
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_119/sol1.py)
  * Problem 120
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_120/sol1.py)
  * Problem 121
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_121/sol1.py)
  * Problem 123
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_123/sol1.py)
  * Problem 125
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_125/sol1.py)
  * Problem 129
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_129/sol1.py)
  * Problem 135
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_135/sol1.py)
  * Problem 144
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_144/sol1.py)
  * Problem 173
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_173/sol1.py)
  * Problem 174
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_174/sol1.py)
  * Problem 180
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_180/sol1.py)
  * Problem 188
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_188/sol1.py)
  * Problem 191
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_191/sol1.py)
  * Problem 203
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_203/sol1.py)
  * Problem 206
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_206/sol1.py)
  * Problem 207
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_207/sol1.py)
  * Problem 234
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_234/sol1.py)
  * Problem 301
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_301/sol1.py)
  * Problem 551
    * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_551/sol1.py)

## Quantum
  * [Deutsch Jozsa](https://github.com/TheAlgorithms/Python/blob/master/quantum/deutsch_jozsa.py)
  * [Half Adder](https://github.com/TheAlgorithms/Python/blob/master/quantum/half_adder.py)
  * [Not Gate](https://github.com/TheAlgorithms/Python/blob/master/quantum/not_gate.py)
  * [Quantum Entanglement](https://github.com/TheAlgorithms/Python/blob/master/quantum/quantum_entanglement.py)
  * [Ripple Adder Classic](https://github.com/TheAlgorithms/Python/blob/master/quantum/ripple_adder_classic.py)
  * [Single Qubit Measure](https://github.com/TheAlgorithms/Python/blob/master/quantum/single_qubit_measure.py)

## Scheduling
  * [First Come First Served](https://github.com/TheAlgorithms/Python/blob/master/scheduling/first_come_first_served.py)
  * [Round Robin](https://github.com/TheAlgorithms/Python/blob/master/scheduling/round_robin.py)
  * [Shortest Job First](https://github.com/TheAlgorithms/Python/blob/master/scheduling/shortest_job_first.py)

## Searches
  * [Binary Search](https://github.com/TheAlgorithms/Python/blob/master/searches/binary_search.py)
  * [Binary Tree Traversal](https://github.com/TheAlgorithms/Python/blob/master/searches/binary_tree_traversal.py)
  * [Double Linear Search](https://github.com/TheAlgorithms/Python/blob/master/searches/double_linear_search.py)
  * [Double Linear Search Recursion](https://github.com/TheAlgorithms/Python/blob/master/searches/double_linear_search_recursion.py)
  * [Fibonacci Search](https://github.com/TheAlgorithms/Python/blob/master/searches/fibonacci_search.py)
  * [Hill Climbing](https://github.com/TheAlgorithms/Python/blob/master/searches/hill_climbing.py)
  * [Interpolation Search](https://github.com/TheAlgorithms/Python/blob/master/searches/interpolation_search.py)
  * [Jump Search](https://github.com/TheAlgorithms/Python/blob/master/searches/jump_search.py)
  * [Linear Search](https://github.com/TheAlgorithms/Python/blob/master/searches/linear_search.py)
  * [Quick Select](https://github.com/TheAlgorithms/Python/blob/master/searches/quick_select.py)
  * [Sentinel Linear Search](https://github.com/TheAlgorithms/Python/blob/master/searches/sentinel_linear_search.py)
  * [Simple Binary Search](https://github.com/TheAlgorithms/Python/blob/master/searches/simple_binary_search.py)
  * [Simulated Annealing](https://github.com/TheAlgorithms/Python/blob/master/searches/simulated_annealing.py)
  * [Tabu Search](https://github.com/TheAlgorithms/Python/blob/master/searches/tabu_search.py)
  * [Ternary Search](https://github.com/TheAlgorithms/Python/blob/master/searches/ternary_search.py)

## Sorts
  * [Bead Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/bead_sort.py)
  * [Bitonic Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/bitonic_sort.py)
  * [Bogo Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/bogo_sort.py)
  * [Bubble Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/bubble_sort.py)
  * [Bucket Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/bucket_sort.py)
  * [Cocktail Shaker Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/cocktail_shaker_sort.py)
  * [Comb Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/comb_sort.py)
  * [Counting Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/counting_sort.py)
  * [Cycle Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/cycle_sort.py)
  * [Double Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/double_sort.py)
  * [Dutch National Flag Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/dutch_national_flag_sort.py)
  * [Exchange Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/exchange_sort.py)
  * [External Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/external_sort.py)
  * [Gnome Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/gnome_sort.py)
  * [Heap Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/heap_sort.py)
  * [Insertion Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/insertion_sort.py)
  * [Intro Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/intro_sort.py)
  * [Iterative Merge Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/iterative_merge_sort.py)
  * [Merge Insertion Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/merge_insertion_sort.py)
  * [Merge Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/merge_sort.py)
  * [Msd Radix Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/msd_radix_sort.py)
  * [Natural Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/natural_sort.py)
  * [Odd Even Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/odd_even_sort.py)
  * [Odd Even Transposition Parallel](https://github.com/TheAlgorithms/Python/blob/master/sorts/odd_even_transposition_parallel.py)
  * [Odd Even Transposition Single Threaded](https://github.com/TheAlgorithms/Python/blob/master/sorts/odd_even_transposition_single_threaded.py)
  * [Pancake Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/pancake_sort.py)
  * [Patience Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/patience_sort.py)
  * [Pigeon Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/pigeon_sort.py)
  * [Pigeonhole Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/pigeonhole_sort.py)
  * [Quick Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/quick_sort.py)
  * [Quick Sort 3 Partition](https://github.com/TheAlgorithms/Python/blob/master/sorts/quick_sort_3_partition.py)
  * [Radix Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/radix_sort.py)
  * [Random Normal Distribution Quicksort](https://github.com/TheAlgorithms/Python/blob/master/sorts/random_normal_distribution_quicksort.py)
  * [Random Pivot Quick Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/random_pivot_quick_sort.py)
  * [Recursive Bubble Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/recursive_bubble_sort.py)
  * [Recursive Insertion Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/recursive_insertion_sort.py)
  * [Recursive Mergesort Array](https://github.com/TheAlgorithms/Python/blob/master/sorts/recursive_mergesort_array.py)
  * [Recursive Quick Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/recursive_quick_sort.py)
  * [Selection Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/selection_sort.py)
  * [Shell Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/shell_sort.py)
  * [Slowsort](https://github.com/TheAlgorithms/Python/blob/master/sorts/slowsort.py)
  * [Stooge Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/stooge_sort.py)
  * [Strand Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/strand_sort.py)
  * [Tim Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/tim_sort.py)
  * [Topological Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/topological_sort.py)
  * [Tree Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/tree_sort.py)
  * [Unknown Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/unknown_sort.py)
  * [Wiggle Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/wiggle_sort.py)

## Strings
  * [Aho Corasick](https://github.com/TheAlgorithms/Python/blob/master/strings/aho_corasick.py)
  * [Alternative String Arrange](https://github.com/TheAlgorithms/Python/blob/master/strings/alternative_string_arrange.py)
  * [Anagrams](https://github.com/TheAlgorithms/Python/blob/master/strings/anagrams.py)
  * [Autocomplete Using Trie](https://github.com/TheAlgorithms/Python/blob/master/strings/autocomplete_using_trie.py)
  * [Boyer Moore Search](https://github.com/TheAlgorithms/Python/blob/master/strings/boyer_moore_search.py)
  * [Can String Be Rearranged As Palindrome](https://github.com/TheAlgorithms/Python/blob/master/strings/can_string_be_rearranged_as_palindrome.py)
  * [Capitalize](https://github.com/TheAlgorithms/Python/blob/master/strings/capitalize.py)
  * [Check Anagrams](https://github.com/TheAlgorithms/Python/blob/master/strings/check_anagrams.py)
  * [Check Pangram](https://github.com/TheAlgorithms/Python/blob/master/strings/check_pangram.py)
  * [Credit Card Validator](https://github.com/TheAlgorithms/Python/blob/master/strings/credit_card_validator.py)
  * [Detecting English Programmatically](https://github.com/TheAlgorithms/Python/blob/master/strings/detecting_english_programmatically.py)
  * [Frequency Finder](https://github.com/TheAlgorithms/Python/blob/master/strings/frequency_finder.py)
  * [Indian Phone Validator](https://github.com/TheAlgorithms/Python/blob/master/strings/indian_phone_validator.py)
  * [Is Palindrome](https://github.com/TheAlgorithms/Python/blob/master/strings/is_palindrome.py)
  * [Jaro Winkler](https://github.com/TheAlgorithms/Python/blob/master/strings/jaro_winkler.py)
  * [Join](https://github.com/TheAlgorithms/Python/blob/master/strings/join.py)
  * [Knuth Morris Pratt](https://github.com/TheAlgorithms/Python/blob/master/strings/knuth_morris_pratt.py)
  * [Levenshtein Distance](https://github.com/TheAlgorithms/Python/blob/master/strings/levenshtein_distance.py)
  * [Lower](https://github.com/TheAlgorithms/Python/blob/master/strings/lower.py)
  * [Manacher](https://github.com/TheAlgorithms/Python/blob/master/strings/manacher.py)
  * [Min Cost String Conversion](https://github.com/TheAlgorithms/Python/blob/master/strings/min_cost_string_conversion.py)
  * [Naive String Search](https://github.com/TheAlgorithms/Python/blob/master/strings/naive_string_search.py)
  * [Palindrome](https://github.com/TheAlgorithms/Python/blob/master/strings/palindrome.py)
  * [Prefix Function](https://github.com/TheAlgorithms/Python/blob/master/strings/prefix_function.py)
  * [Rabin Karp](https://github.com/TheAlgorithms/Python/blob/master/strings/rabin_karp.py)
  * [Remove Duplicate](https://github.com/TheAlgorithms/Python/blob/master/strings/remove_duplicate.py)
  * [Reverse Letters](https://github.com/TheAlgorithms/Python/blob/master/strings/reverse_letters.py)
  * [Reverse Long Words](https://github.com/TheAlgorithms/Python/blob/master/strings/reverse_long_words.py)
  * [Reverse Words](https://github.com/TheAlgorithms/Python/blob/master/strings/reverse_words.py)
  * [Split](https://github.com/TheAlgorithms/Python/blob/master/strings/split.py)
  * [Upper](https://github.com/TheAlgorithms/Python/blob/master/strings/upper.py)
  * [Wildcard Pattern Matching](https://github.com/TheAlgorithms/Python/blob/master/strings/wildcard_pattern_matching.py)
  * [Word Occurrence](https://github.com/TheAlgorithms/Python/blob/master/strings/word_occurrence.py)
  * [Word Patterns](https://github.com/TheAlgorithms/Python/blob/master/strings/word_patterns.py)
  * [Z Function](https://github.com/TheAlgorithms/Python/blob/master/strings/z_function.py)

## Web Programming
  * [Co2 Emission](https://github.com/TheAlgorithms/Python/blob/master/web_programming/co2_emission.py)
  * [Covid Stats Via Xpath](https://github.com/TheAlgorithms/Python/blob/master/web_programming/covid_stats_via_xpath.py)
  * [Crawl Google Results](https://github.com/TheAlgorithms/Python/blob/master/web_programming/crawl_google_results.py)
  * [Crawl Google Scholar Citation](https://github.com/TheAlgorithms/Python/blob/master/web_programming/crawl_google_scholar_citation.py)
  * [Currency Converter](https://github.com/TheAlgorithms/Python/blob/master/web_programming/currency_converter.py)
  * [Current Stock Price](https://github.com/TheAlgorithms/Python/blob/master/web_programming/current_stock_price.py)
  * [Current Weather](https://github.com/TheAlgorithms/Python/blob/master/web_programming/current_weather.py)
  * [Daily Horoscope](https://github.com/TheAlgorithms/Python/blob/master/web_programming/daily_horoscope.py)
  * [Download Images From Google Query](https://github.com/TheAlgorithms/Python/blob/master/web_programming/download_images_from_google_query.py)
  * [Emails From Url](https://github.com/TheAlgorithms/Python/blob/master/web_programming/emails_from_url.py)
  * [Fetch Bbc News](https://github.com/TheAlgorithms/Python/blob/master/web_programming/fetch_bbc_news.py)
  * [Fetch Github Info](https://github.com/TheAlgorithms/Python/blob/master/web_programming/fetch_github_info.py)
  * [Fetch Jobs](https://github.com/TheAlgorithms/Python/blob/master/web_programming/fetch_jobs.py)
  * [Get Imdb Top 250 Movies Csv](https://github.com/TheAlgorithms/Python/blob/master/web_programming/get_imdb_top_250_movies_csv.py)
  * [Get Imdbtop](https://github.com/TheAlgorithms/Python/blob/master/web_programming/get_imdbtop.py)
  * [Get Top Hn Posts](https://github.com/TheAlgorithms/Python/blob/master/web_programming/get_top_hn_posts.py)
  * [Get User Tweets](https://github.com/TheAlgorithms/Python/blob/master/web_programming/get_user_tweets.py)
  * [Giphy](https://github.com/TheAlgorithms/Python/blob/master/web_programming/giphy.py)
  * [Instagram Crawler](https://github.com/TheAlgorithms/Python/blob/master/web_programming/instagram_crawler.py)
  * [Instagram Pic](https://github.com/TheAlgorithms/Python/blob/master/web_programming/instagram_pic.py)
  * [Instagram Video](https://github.com/TheAlgorithms/Python/blob/master/web_programming/instagram_video.py)
  * [Nasa Data](https://github.com/TheAlgorithms/Python/blob/master/web_programming/nasa_data.py)
  * [Random Anime Character](https://github.com/TheAlgorithms/Python/blob/master/web_programming/random_anime_character.py)
  * [Recaptcha Verification](https://github.com/TheAlgorithms/Python/blob/master/web_programming/recaptcha_verification.py)
  * [Slack Message](https://github.com/TheAlgorithms/Python/blob/master/web_programming/slack_message.py)
  * [Test Fetch Github Info](https://github.com/TheAlgorithms/Python/blob/master/web_programming/test_fetch_github_info.py)
  * [World Covid19 Stats](https://github.com/TheAlgorithms/Python/blob/master/web_programming/world_covid19_stats.py)
Polygon](https://github.com/TheAlgorithms/Python/blob/master/maths/check_polygon.py) * [Chudnovsky Algorithm](https://github.com/TheAlgorithms/Python/blob/master/maths/chudnovsky_algorithm.py) * [Collatz Sequence](https://github.com/TheAlgorithms/Python/blob/master/maths/collatz_sequence.py) * [Combinations](https://github.com/TheAlgorithms/Python/blob/master/maths/combinations.py) * [Decimal Isolate](https://github.com/TheAlgorithms/Python/blob/master/maths/decimal_isolate.py) * [Double Factorial Iterative](https://github.com/TheAlgorithms/Python/blob/master/maths/double_factorial_iterative.py) * [Double Factorial Recursive](https://github.com/TheAlgorithms/Python/blob/master/maths/double_factorial_recursive.py) * [Entropy](https://github.com/TheAlgorithms/Python/blob/master/maths/entropy.py) * [Euclidean Distance](https://github.com/TheAlgorithms/Python/blob/master/maths/euclidean_distance.py) * [Euclidean Gcd](https://github.com/TheAlgorithms/Python/blob/master/maths/euclidean_gcd.py) * [Euler Method](https://github.com/TheAlgorithms/Python/blob/master/maths/euler_method.py) * [Euler Modified](https://github.com/TheAlgorithms/Python/blob/master/maths/euler_modified.py) * [Eulers Totient](https://github.com/TheAlgorithms/Python/blob/master/maths/eulers_totient.py) * [Extended Euclidean Algorithm](https://github.com/TheAlgorithms/Python/blob/master/maths/extended_euclidean_algorithm.py) * [Factorial Iterative](https://github.com/TheAlgorithms/Python/blob/master/maths/factorial_iterative.py) * [Factorial Recursive](https://github.com/TheAlgorithms/Python/blob/master/maths/factorial_recursive.py) * [Factors](https://github.com/TheAlgorithms/Python/blob/master/maths/factors.py) * [Fermat Little Theorem](https://github.com/TheAlgorithms/Python/blob/master/maths/fermat_little_theorem.py) * [Fibonacci](https://github.com/TheAlgorithms/Python/blob/master/maths/fibonacci.py) * [Fibonacci Sequence Recursion](https://github.com/TheAlgorithms/Python/blob/master/maths/fibonacci_sequence_recursion.py) * [Find Max](https://github.com/TheAlgorithms/Python/blob/master/maths/find_max.py) * [Find Max Recursion](https://github.com/TheAlgorithms/Python/blob/master/maths/find_max_recursion.py) * [Find Min](https://github.com/TheAlgorithms/Python/blob/master/maths/find_min.py) * [Find Min Recursion](https://github.com/TheAlgorithms/Python/blob/master/maths/find_min_recursion.py) * [Floor](https://github.com/TheAlgorithms/Python/blob/master/maths/floor.py) * [Gamma](https://github.com/TheAlgorithms/Python/blob/master/maths/gamma.py) * [Gamma Recursive](https://github.com/TheAlgorithms/Python/blob/master/maths/gamma_recursive.py) * [Gaussian](https://github.com/TheAlgorithms/Python/blob/master/maths/gaussian.py) * [Greatest Common Divisor](https://github.com/TheAlgorithms/Python/blob/master/maths/greatest_common_divisor.py) * [Greedy Coin Change](https://github.com/TheAlgorithms/Python/blob/master/maths/greedy_coin_change.py) * [Hardy Ramanujanalgo](https://github.com/TheAlgorithms/Python/blob/master/maths/hardy_ramanujanalgo.py) * [Integration By Simpson Approx](https://github.com/TheAlgorithms/Python/blob/master/maths/integration_by_simpson_approx.py) * [Is Ip V4 Address Valid](https://github.com/TheAlgorithms/Python/blob/master/maths/is_ip_v4_address_valid.py) * [Is Square Free](https://github.com/TheAlgorithms/Python/blob/master/maths/is_square_free.py) * [Jaccard Similarity](https://github.com/TheAlgorithms/Python/blob/master/maths/jaccard_similarity.py) * 
[Kadanes](https://github.com/TheAlgorithms/Python/blob/master/maths/kadanes.py) * [Karatsuba](https://github.com/TheAlgorithms/Python/blob/master/maths/karatsuba.py) * [Krishnamurthy Number](https://github.com/TheAlgorithms/Python/blob/master/maths/krishnamurthy_number.py) * [Kth Lexicographic Permutation](https://github.com/TheAlgorithms/Python/blob/master/maths/kth_lexicographic_permutation.py) * [Largest Of Very Large Numbers](https://github.com/TheAlgorithms/Python/blob/master/maths/largest_of_very_large_numbers.py) * [Largest Subarray Sum](https://github.com/TheAlgorithms/Python/blob/master/maths/largest_subarray_sum.py) * [Least Common Multiple](https://github.com/TheAlgorithms/Python/blob/master/maths/least_common_multiple.py) * [Line Length](https://github.com/TheAlgorithms/Python/blob/master/maths/line_length.py) * [Lucas Lehmer Primality Test](https://github.com/TheAlgorithms/Python/blob/master/maths/lucas_lehmer_primality_test.py) * [Lucas Series](https://github.com/TheAlgorithms/Python/blob/master/maths/lucas_series.py) * [Matrix Exponentiation](https://github.com/TheAlgorithms/Python/blob/master/maths/matrix_exponentiation.py) * [Max Sum Sliding Window](https://github.com/TheAlgorithms/Python/blob/master/maths/max_sum_sliding_window.py) * [Median Of Two Arrays](https://github.com/TheAlgorithms/Python/blob/master/maths/median_of_two_arrays.py) * [Miller Rabin](https://github.com/TheAlgorithms/Python/blob/master/maths/miller_rabin.py) * [Mobius Function](https://github.com/TheAlgorithms/Python/blob/master/maths/mobius_function.py) * [Modular Exponential](https://github.com/TheAlgorithms/Python/blob/master/maths/modular_exponential.py) * [Monte Carlo](https://github.com/TheAlgorithms/Python/blob/master/maths/monte_carlo.py) * [Monte Carlo Dice](https://github.com/TheAlgorithms/Python/blob/master/maths/monte_carlo_dice.py) * [Newton Raphson](https://github.com/TheAlgorithms/Python/blob/master/maths/newton_raphson.py) * [Number Of Digits](https://github.com/TheAlgorithms/Python/blob/master/maths/number_of_digits.py) * [Numerical Integration](https://github.com/TheAlgorithms/Python/blob/master/maths/numerical_integration.py) * [Perfect Cube](https://github.com/TheAlgorithms/Python/blob/master/maths/perfect_cube.py) * [Perfect Number](https://github.com/TheAlgorithms/Python/blob/master/maths/perfect_number.py) * [Perfect Square](https://github.com/TheAlgorithms/Python/blob/master/maths/perfect_square.py) * [Pi Monte Carlo Estimation](https://github.com/TheAlgorithms/Python/blob/master/maths/pi_monte_carlo_estimation.py) * [Polynomial Evaluation](https://github.com/TheAlgorithms/Python/blob/master/maths/polynomial_evaluation.py) * [Power Using Recursion](https://github.com/TheAlgorithms/Python/blob/master/maths/power_using_recursion.py) * [Prime Check](https://github.com/TheAlgorithms/Python/blob/master/maths/prime_check.py) * [Prime Factors](https://github.com/TheAlgorithms/Python/blob/master/maths/prime_factors.py) * [Prime Numbers](https://github.com/TheAlgorithms/Python/blob/master/maths/prime_numbers.py) * [Prime Sieve Eratosthenes](https://github.com/TheAlgorithms/Python/blob/master/maths/prime_sieve_eratosthenes.py) * [Primelib](https://github.com/TheAlgorithms/Python/blob/master/maths/primelib.py) * [Proth Number](https://github.com/TheAlgorithms/Python/blob/master/maths/proth_number.py) * [Pythagoras](https://github.com/TheAlgorithms/Python/blob/master/maths/pythagoras.py) * [Qr Decomposition](https://github.com/TheAlgorithms/Python/blob/master/maths/qr_decomposition.py) * 
[Quadratic Equations Complex Numbers](https://github.com/TheAlgorithms/Python/blob/master/maths/quadratic_equations_complex_numbers.py) * [Radians](https://github.com/TheAlgorithms/Python/blob/master/maths/radians.py) * [Radix2 Fft](https://github.com/TheAlgorithms/Python/blob/master/maths/radix2_fft.py) * [Relu](https://github.com/TheAlgorithms/Python/blob/master/maths/relu.py) * [Runge Kutta](https://github.com/TheAlgorithms/Python/blob/master/maths/runge_kutta.py) * [Segmented Sieve](https://github.com/TheAlgorithms/Python/blob/master/maths/segmented_sieve.py) * Series * [Arithmetic](https://github.com/TheAlgorithms/Python/blob/master/maths/series/arithmetic.py) * [Geometric](https://github.com/TheAlgorithms/Python/blob/master/maths/series/geometric.py) * [Geometric Series](https://github.com/TheAlgorithms/Python/blob/master/maths/series/geometric_series.py) * [Harmonic](https://github.com/TheAlgorithms/Python/blob/master/maths/series/harmonic.py) * [Harmonic Series](https://github.com/TheAlgorithms/Python/blob/master/maths/series/harmonic_series.py) * [P Series](https://github.com/TheAlgorithms/Python/blob/master/maths/series/p_series.py) * [Sieve Of Eratosthenes](https://github.com/TheAlgorithms/Python/blob/master/maths/sieve_of_eratosthenes.py) * [Sigmoid](https://github.com/TheAlgorithms/Python/blob/master/maths/sigmoid.py) * [Simpson Rule](https://github.com/TheAlgorithms/Python/blob/master/maths/simpson_rule.py) * [Softmax](https://github.com/TheAlgorithms/Python/blob/master/maths/softmax.py) * [Square Root](https://github.com/TheAlgorithms/Python/blob/master/maths/square_root.py) * [Sum Of Arithmetic Series](https://github.com/TheAlgorithms/Python/blob/master/maths/sum_of_arithmetic_series.py) * [Sum Of Digits](https://github.com/TheAlgorithms/Python/blob/master/maths/sum_of_digits.py) * [Sum Of Geometric Progression](https://github.com/TheAlgorithms/Python/blob/master/maths/sum_of_geometric_progression.py) * [Sylvester Sequence](https://github.com/TheAlgorithms/Python/blob/master/maths/sylvester_sequence.py) * [Test Prime Check](https://github.com/TheAlgorithms/Python/blob/master/maths/test_prime_check.py) * [Trapezoidal Rule](https://github.com/TheAlgorithms/Python/blob/master/maths/trapezoidal_rule.py) * [Triplet Sum](https://github.com/TheAlgorithms/Python/blob/master/maths/triplet_sum.py) * [Two Pointer](https://github.com/TheAlgorithms/Python/blob/master/maths/two_pointer.py) * [Two Sum](https://github.com/TheAlgorithms/Python/blob/master/maths/two_sum.py) * [Ugly Numbers](https://github.com/TheAlgorithms/Python/blob/master/maths/ugly_numbers.py) * [Volume](https://github.com/TheAlgorithms/Python/blob/master/maths/volume.py) * [Zellers Congruence](https://github.com/TheAlgorithms/Python/blob/master/maths/zellers_congruence.py) ## Matrix * [Count Islands In Matrix](https://github.com/TheAlgorithms/Python/blob/master/matrix/count_islands_in_matrix.py) * [Inverse Of Matrix](https://github.com/TheAlgorithms/Python/blob/master/matrix/inverse_of_matrix.py) * [Matrix Class](https://github.com/TheAlgorithms/Python/blob/master/matrix/matrix_class.py) * [Matrix Operation](https://github.com/TheAlgorithms/Python/blob/master/matrix/matrix_operation.py) * [Nth Fibonacci Using Matrix Exponentiation](https://github.com/TheAlgorithms/Python/blob/master/matrix/nth_fibonacci_using_matrix_exponentiation.py) * [Rotate Matrix](https://github.com/TheAlgorithms/Python/blob/master/matrix/rotate_matrix.py) * [Searching In Sorted 
Matrix](https://github.com/TheAlgorithms/Python/blob/master/matrix/searching_in_sorted_matrix.py) * [Sherman Morrison](https://github.com/TheAlgorithms/Python/blob/master/matrix/sherman_morrison.py) * [Spiral Print](https://github.com/TheAlgorithms/Python/blob/master/matrix/spiral_print.py) * Tests * [Test Matrix Operation](https://github.com/TheAlgorithms/Python/blob/master/matrix/tests/test_matrix_operation.py) ## Networking Flow * [Ford Fulkerson](https://github.com/TheAlgorithms/Python/blob/master/networking_flow/ford_fulkerson.py) * [Minimum Cut](https://github.com/TheAlgorithms/Python/blob/master/networking_flow/minimum_cut.py) ## Neural Network * [2 Hidden Layers Neural Network](https://github.com/TheAlgorithms/Python/blob/master/neural_network/2_hidden_layers_neural_network.py) * [Back Propagation Neural Network](https://github.com/TheAlgorithms/Python/blob/master/neural_network/back_propagation_neural_network.py) * [Convolution Neural Network](https://github.com/TheAlgorithms/Python/blob/master/neural_network/convolution_neural_network.py) * [Perceptron](https://github.com/TheAlgorithms/Python/blob/master/neural_network/perceptron.py) ## Other * [Activity Selection](https://github.com/TheAlgorithms/Python/blob/master/other/activity_selection.py) * [Check Strong Password](https://github.com/TheAlgorithms/Python/blob/master/other/check_strong_password.py) * [Date To Weekday](https://github.com/TheAlgorithms/Python/blob/master/other/date_to_weekday.py) * [Davisb Putnamb Logemannb Loveland](https://github.com/TheAlgorithms/Python/blob/master/other/davisb_putnamb_logemannb_loveland.py) * [Dijkstra Bankers Algorithm](https://github.com/TheAlgorithms/Python/blob/master/other/dijkstra_bankers_algorithm.py) * [Doomsday](https://github.com/TheAlgorithms/Python/blob/master/other/doomsday.py) * [Fischer Yates Shuffle](https://github.com/TheAlgorithms/Python/blob/master/other/fischer_yates_shuffle.py) * [Gauss Easter](https://github.com/TheAlgorithms/Python/blob/master/other/gauss_easter.py) * [Graham Scan](https://github.com/TheAlgorithms/Python/blob/master/other/graham_scan.py) * [Greedy](https://github.com/TheAlgorithms/Python/blob/master/other/greedy.py) * [Least Recently Used](https://github.com/TheAlgorithms/Python/blob/master/other/least_recently_used.py) * [Lfu Cache](https://github.com/TheAlgorithms/Python/blob/master/other/lfu_cache.py) * [Linear Congruential Generator](https://github.com/TheAlgorithms/Python/blob/master/other/linear_congruential_generator.py) * [Lru Cache](https://github.com/TheAlgorithms/Python/blob/master/other/lru_cache.py) * [Magicdiamondpattern](https://github.com/TheAlgorithms/Python/blob/master/other/magicdiamondpattern.py) * [Nested Brackets](https://github.com/TheAlgorithms/Python/blob/master/other/nested_brackets.py) * [Password Generator](https://github.com/TheAlgorithms/Python/blob/master/other/password_generator.py) * [Scoring Algorithm](https://github.com/TheAlgorithms/Python/blob/master/other/scoring_algorithm.py) * [Sdes](https://github.com/TheAlgorithms/Python/blob/master/other/sdes.py) * [Tower Of Hanoi](https://github.com/TheAlgorithms/Python/blob/master/other/tower_of_hanoi.py) ## Physics * [N Body Simulation](https://github.com/TheAlgorithms/Python/blob/master/physics/n_body_simulation.py) ## Project Euler * Problem 001 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_001/sol1.py) * [Sol2](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_001/sol2.py) * 
[Sol3](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_001/sol3.py) * [Sol4](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_001/sol4.py) * [Sol5](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_001/sol5.py) * [Sol6](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_001/sol6.py) * [Sol7](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_001/sol7.py) * Problem 002 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_002/sol1.py) * [Sol2](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_002/sol2.py) * [Sol3](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_002/sol3.py) * [Sol4](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_002/sol4.py) * [Sol5](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_002/sol5.py) * Problem 003 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_003/sol1.py) * [Sol2](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_003/sol2.py) * [Sol3](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_003/sol3.py) * Problem 004 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_004/sol1.py) * [Sol2](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_004/sol2.py) * Problem 005 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_005/sol1.py) * [Sol2](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_005/sol2.py) * Problem 006 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_006/sol1.py) * [Sol2](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_006/sol2.py) * [Sol3](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_006/sol3.py) * [Sol4](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_006/sol4.py) * Problem 007 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_007/sol1.py) * [Sol2](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_007/sol2.py) * [Sol3](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_007/sol3.py) * Problem 008 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_008/sol1.py) * [Sol2](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_008/sol2.py) * [Sol3](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_008/sol3.py) * Problem 009 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_009/sol1.py) * [Sol2](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_009/sol2.py) * [Sol3](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_009/sol3.py) * Problem 010 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_010/sol1.py) * [Sol2](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_010/sol2.py) * [Sol3](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_010/sol3.py) * Problem 011 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_011/sol1.py) * [Sol2](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_011/sol2.py) * Problem 012 * 
[Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_012/sol1.py) * [Sol2](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_012/sol2.py) * Problem 013 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_013/sol1.py) * Problem 014 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_014/sol1.py) * [Sol2](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_014/sol2.py) * Problem 015 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_015/sol1.py) * Problem 016 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_016/sol1.py) * [Sol2](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_016/sol2.py) * Problem 017 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_017/sol1.py) * Problem 018 * [Solution](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_018/solution.py) * Problem 019 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_019/sol1.py) * Problem 020 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_020/sol1.py) * [Sol2](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_020/sol2.py) * [Sol3](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_020/sol3.py) * [Sol4](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_020/sol4.py) * Problem 021 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_021/sol1.py) * Problem 022 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_022/sol1.py) * [Sol2](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_022/sol2.py) * Problem 023 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_023/sol1.py) * Problem 024 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_024/sol1.py) * Problem 025 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_025/sol1.py) * [Sol2](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_025/sol2.py) * [Sol3](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_025/sol3.py) * Problem 026 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_026/sol1.py) * Problem 027 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_027/sol1.py) * Problem 028 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_028/sol1.py) * Problem 029 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_029/sol1.py) * Problem 030 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_030/sol1.py) * Problem 031 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_031/sol1.py) * [Sol2](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_031/sol2.py) * Problem 032 * [Sol32](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_032/sol32.py) * Problem 033 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_033/sol1.py) * Problem 034 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_034/sol1.py) * Problem 035 * 
[Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_035/sol1.py) * Problem 036 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_036/sol1.py) * Problem 037 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_037/sol1.py) * Problem 038 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_038/sol1.py) * Problem 039 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_039/sol1.py) * Problem 040 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_040/sol1.py) * Problem 041 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_041/sol1.py) * Problem 042 * [Solution42](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_042/solution42.py) * Problem 043 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_043/sol1.py) * Problem 044 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_044/sol1.py) * Problem 045 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_045/sol1.py) * Problem 046 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_046/sol1.py) * Problem 047 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_047/sol1.py) * Problem 048 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_048/sol1.py) * Problem 049 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_049/sol1.py) * Problem 050 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_050/sol1.py) * Problem 051 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_051/sol1.py) * Problem 052 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_052/sol1.py) * Problem 053 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_053/sol1.py) * Problem 054 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_054/sol1.py) * [Test Poker Hand](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_054/test_poker_hand.py) * Problem 055 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_055/sol1.py) * Problem 056 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_056/sol1.py) * Problem 057 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_057/sol1.py) * Problem 058 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_058/sol1.py) * Problem 059 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_059/sol1.py) * Problem 062 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_062/sol1.py) * Problem 063 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_063/sol1.py) * Problem 064 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_064/sol1.py) * Problem 065 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_065/sol1.py) * Problem 067 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_067/sol1.py) * Problem 069 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_069/sol1.py) * Problem 070 * 
[Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_070/sol1.py) * Problem 071 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_071/sol1.py) * Problem 072 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_072/sol1.py) * [Sol2](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_072/sol2.py) * Problem 074 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_074/sol1.py) * [Sol2](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_074/sol2.py) * Problem 075 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_075/sol1.py) * Problem 076 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_076/sol1.py) * Problem 077 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_077/sol1.py) * Problem 078 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_078/sol1.py) * Problem 080 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_080/sol1.py) * Problem 081 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_081/sol1.py) * Problem 085 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_085/sol1.py) * Problem 086 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_086/sol1.py) * Problem 087 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_087/sol1.py) * Problem 089 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_089/sol1.py) * Problem 091 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_091/sol1.py) * Problem 092 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_092/sol1.py) * Problem 097 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_097/sol1.py) * Problem 099 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_099/sol1.py) * Problem 101 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_101/sol1.py) * Problem 102 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_102/sol1.py) * Problem 107 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_107/sol1.py) * Problem 109 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_109/sol1.py) * Problem 112 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_112/sol1.py) * Problem 113 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_113/sol1.py) * Problem 119 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_119/sol1.py) * Problem 120 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_120/sol1.py) * Problem 121 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_121/sol1.py) * Problem 123 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_123/sol1.py) * Problem 125 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_125/sol1.py) * Problem 129 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_129/sol1.py) * Problem 135 * 
[Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_135/sol1.py) * Problem 144 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_144/sol1.py) * Problem 173 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_173/sol1.py) * Problem 174 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_174/sol1.py) * Problem 180 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_180/sol1.py) * Problem 188 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_188/sol1.py) * Problem 191 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_191/sol1.py) * Problem 203 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_203/sol1.py) * Problem 206 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_206/sol1.py) * Problem 207 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_207/sol1.py) * Problem 234 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_234/sol1.py) * Problem 301 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_301/sol1.py) * Problem 551 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_551/sol1.py) * Problem 686 * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_686/sol1.py) ## Quantum * [Deutsch Jozsa](https://github.com/TheAlgorithms/Python/blob/master/quantum/deutsch_jozsa.py) * [Half Adder](https://github.com/TheAlgorithms/Python/blob/master/quantum/half_adder.py) * [Not Gate](https://github.com/TheAlgorithms/Python/blob/master/quantum/not_gate.py) * [Quantum Entanglement](https://github.com/TheAlgorithms/Python/blob/master/quantum/quantum_entanglement.py) * [Ripple Adder Classic](https://github.com/TheAlgorithms/Python/blob/master/quantum/ripple_adder_classic.py) * [Single Qubit Measure](https://github.com/TheAlgorithms/Python/blob/master/quantum/single_qubit_measure.py) ## Scheduling * [First Come First Served](https://github.com/TheAlgorithms/Python/blob/master/scheduling/first_come_first_served.py) * [Round Robin](https://github.com/TheAlgorithms/Python/blob/master/scheduling/round_robin.py) * [Shortest Job First](https://github.com/TheAlgorithms/Python/blob/master/scheduling/shortest_job_first.py) ## Searches * [Binary Search](https://github.com/TheAlgorithms/Python/blob/master/searches/binary_search.py) * [Binary Tree Traversal](https://github.com/TheAlgorithms/Python/blob/master/searches/binary_tree_traversal.py) * [Double Linear Search](https://github.com/TheAlgorithms/Python/blob/master/searches/double_linear_search.py) * [Double Linear Search Recursion](https://github.com/TheAlgorithms/Python/blob/master/searches/double_linear_search_recursion.py) * [Fibonacci Search](https://github.com/TheAlgorithms/Python/blob/master/searches/fibonacci_search.py) * [Hill Climbing](https://github.com/TheAlgorithms/Python/blob/master/searches/hill_climbing.py) * [Interpolation Search](https://github.com/TheAlgorithms/Python/blob/master/searches/interpolation_search.py) * [Jump Search](https://github.com/TheAlgorithms/Python/blob/master/searches/jump_search.py) * [Linear Search](https://github.com/TheAlgorithms/Python/blob/master/searches/linear_search.py) * [Quick Select](https://github.com/TheAlgorithms/Python/blob/master/searches/quick_select.py) * [Sentinel Linear 
Search](https://github.com/TheAlgorithms/Python/blob/master/searches/sentinel_linear_search.py) * [Simple Binary Search](https://github.com/TheAlgorithms/Python/blob/master/searches/simple_binary_search.py) * [Simulated Annealing](https://github.com/TheAlgorithms/Python/blob/master/searches/simulated_annealing.py) * [Tabu Search](https://github.com/TheAlgorithms/Python/blob/master/searches/tabu_search.py) * [Ternary Search](https://github.com/TheAlgorithms/Python/blob/master/searches/ternary_search.py) ## Sorts * [Bead Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/bead_sort.py) * [Bitonic Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/bitonic_sort.py) * [Bogo Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/bogo_sort.py) * [Bubble Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/bubble_sort.py) * [Bucket Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/bucket_sort.py) * [Cocktail Shaker Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/cocktail_shaker_sort.py) * [Comb Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/comb_sort.py) * [Counting Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/counting_sort.py) * [Cycle Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/cycle_sort.py) * [Double Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/double_sort.py) * [Dutch National Flag Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/dutch_national_flag_sort.py) * [Exchange Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/exchange_sort.py) * [External Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/external_sort.py) * [Gnome Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/gnome_sort.py) * [Heap Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/heap_sort.py) * [Insertion Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/insertion_sort.py) * [Intro Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/intro_sort.py) * [Iterative Merge Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/iterative_merge_sort.py) * [Merge Insertion Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/merge_insertion_sort.py) * [Merge Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/merge_sort.py) * [Msd Radix Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/msd_radix_sort.py) * [Natural Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/natural_sort.py) * [Odd Even Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/odd_even_sort.py) * [Odd Even Transposition Parallel](https://github.com/TheAlgorithms/Python/blob/master/sorts/odd_even_transposition_parallel.py) * [Odd Even Transposition Single Threaded](https://github.com/TheAlgorithms/Python/blob/master/sorts/odd_even_transposition_single_threaded.py) * [Pancake Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/pancake_sort.py) * [Patience Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/patience_sort.py) * [Pigeon Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/pigeon_sort.py) * [Pigeonhole Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/pigeonhole_sort.py) * [Quick Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/quick_sort.py) * [Quick Sort 3 Partition](https://github.com/TheAlgorithms/Python/blob/master/sorts/quick_sort_3_partition.py) * [Radix 
Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/radix_sort.py) * [Random Normal Distribution Quicksort](https://github.com/TheAlgorithms/Python/blob/master/sorts/random_normal_distribution_quicksort.py) * [Random Pivot Quick Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/random_pivot_quick_sort.py) * [Recursive Bubble Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/recursive_bubble_sort.py) * [Recursive Insertion Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/recursive_insertion_sort.py) * [Recursive Mergesort Array](https://github.com/TheAlgorithms/Python/blob/master/sorts/recursive_mergesort_array.py) * [Recursive Quick Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/recursive_quick_sort.py) * [Selection Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/selection_sort.py) * [Shell Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/shell_sort.py) * [Slowsort](https://github.com/TheAlgorithms/Python/blob/master/sorts/slowsort.py) * [Stooge Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/stooge_sort.py) * [Strand Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/strand_sort.py) * [Tim Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/tim_sort.py) * [Topological Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/topological_sort.py) * [Tree Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/tree_sort.py) * [Unknown Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/unknown_sort.py) * [Wiggle Sort](https://github.com/TheAlgorithms/Python/blob/master/sorts/wiggle_sort.py) ## Strings * [Aho Corasick](https://github.com/TheAlgorithms/Python/blob/master/strings/aho_corasick.py) * [Alternative String Arrange](https://github.com/TheAlgorithms/Python/blob/master/strings/alternative_string_arrange.py) * [Anagrams](https://github.com/TheAlgorithms/Python/blob/master/strings/anagrams.py) * [Autocomplete Using Trie](https://github.com/TheAlgorithms/Python/blob/master/strings/autocomplete_using_trie.py) * [Boyer Moore Search](https://github.com/TheAlgorithms/Python/blob/master/strings/boyer_moore_search.py) * [Can String Be Rearranged As Palindrome](https://github.com/TheAlgorithms/Python/blob/master/strings/can_string_be_rearranged_as_palindrome.py) * [Capitalize](https://github.com/TheAlgorithms/Python/blob/master/strings/capitalize.py) * [Check Anagrams](https://github.com/TheAlgorithms/Python/blob/master/strings/check_anagrams.py) * [Check Pangram](https://github.com/TheAlgorithms/Python/blob/master/strings/check_pangram.py) * [Credit Card Validator](https://github.com/TheAlgorithms/Python/blob/master/strings/credit_card_validator.py) * [Detecting English Programmatically](https://github.com/TheAlgorithms/Python/blob/master/strings/detecting_english_programmatically.py) * [Frequency Finder](https://github.com/TheAlgorithms/Python/blob/master/strings/frequency_finder.py) * [Indian Phone Validator](https://github.com/TheAlgorithms/Python/blob/master/strings/indian_phone_validator.py) * [Is Palindrome](https://github.com/TheAlgorithms/Python/blob/master/strings/is_palindrome.py) * [Jaro Winkler](https://github.com/TheAlgorithms/Python/blob/master/strings/jaro_winkler.py) * [Join](https://github.com/TheAlgorithms/Python/blob/master/strings/join.py) * [Knuth Morris Pratt](https://github.com/TheAlgorithms/Python/blob/master/strings/knuth_morris_pratt.py) * [Levenshtein 
Distance](https://github.com/TheAlgorithms/Python/blob/master/strings/levenshtein_distance.py) * [Lower](https://github.com/TheAlgorithms/Python/blob/master/strings/lower.py) * [Manacher](https://github.com/TheAlgorithms/Python/blob/master/strings/manacher.py) * [Min Cost String Conversion](https://github.com/TheAlgorithms/Python/blob/master/strings/min_cost_string_conversion.py) * [Naive String Search](https://github.com/TheAlgorithms/Python/blob/master/strings/naive_string_search.py) * [Palindrome](https://github.com/TheAlgorithms/Python/blob/master/strings/palindrome.py) * [Prefix Function](https://github.com/TheAlgorithms/Python/blob/master/strings/prefix_function.py) * [Rabin Karp](https://github.com/TheAlgorithms/Python/blob/master/strings/rabin_karp.py) * [Remove Duplicate](https://github.com/TheAlgorithms/Python/blob/master/strings/remove_duplicate.py) * [Reverse Letters](https://github.com/TheAlgorithms/Python/blob/master/strings/reverse_letters.py) * [Reverse Long Words](https://github.com/TheAlgorithms/Python/blob/master/strings/reverse_long_words.py) * [Reverse Words](https://github.com/TheAlgorithms/Python/blob/master/strings/reverse_words.py) * [Split](https://github.com/TheAlgorithms/Python/blob/master/strings/split.py) * [Upper](https://github.com/TheAlgorithms/Python/blob/master/strings/upper.py) * [Wildcard Pattern Matching](https://github.com/TheAlgorithms/Python/blob/master/strings/wildcard_pattern_matching.py) * [Word Occurrence](https://github.com/TheAlgorithms/Python/blob/master/strings/word_occurrence.py) * [Word Patterns](https://github.com/TheAlgorithms/Python/blob/master/strings/word_patterns.py) * [Z Function](https://github.com/TheAlgorithms/Python/blob/master/strings/z_function.py) ## Web Programming * [Co2 Emission](https://github.com/TheAlgorithms/Python/blob/master/web_programming/co2_emission.py) * [Covid Stats Via Xpath](https://github.com/TheAlgorithms/Python/blob/master/web_programming/covid_stats_via_xpath.py) * [Crawl Google Results](https://github.com/TheAlgorithms/Python/blob/master/web_programming/crawl_google_results.py) * [Crawl Google Scholar Citation](https://github.com/TheAlgorithms/Python/blob/master/web_programming/crawl_google_scholar_citation.py) * [Currency Converter](https://github.com/TheAlgorithms/Python/blob/master/web_programming/currency_converter.py) * [Current Stock Price](https://github.com/TheAlgorithms/Python/blob/master/web_programming/current_stock_price.py) * [Current Weather](https://github.com/TheAlgorithms/Python/blob/master/web_programming/current_weather.py) * [Daily Horoscope](https://github.com/TheAlgorithms/Python/blob/master/web_programming/daily_horoscope.py) * [Download Images From Google Query](https://github.com/TheAlgorithms/Python/blob/master/web_programming/download_images_from_google_query.py) * [Emails From Url](https://github.com/TheAlgorithms/Python/blob/master/web_programming/emails_from_url.py) * [Fetch Bbc News](https://github.com/TheAlgorithms/Python/blob/master/web_programming/fetch_bbc_news.py) * [Fetch Github Info](https://github.com/TheAlgorithms/Python/blob/master/web_programming/fetch_github_info.py) * [Fetch Jobs](https://github.com/TheAlgorithms/Python/blob/master/web_programming/fetch_jobs.py) * [Get Imdb Top 250 Movies Csv](https://github.com/TheAlgorithms/Python/blob/master/web_programming/get_imdb_top_250_movies_csv.py) * [Get Imdbtop](https://github.com/TheAlgorithms/Python/blob/master/web_programming/get_imdbtop.py) * [Get Top Hn 
Posts](https://github.com/TheAlgorithms/Python/blob/master/web_programming/get_top_hn_posts.py) * [Get User Tweets](https://github.com/TheAlgorithms/Python/blob/master/web_programming/get_user_tweets.py) * [Giphy](https://github.com/TheAlgorithms/Python/blob/master/web_programming/giphy.py) * [Instagram Crawler](https://github.com/TheAlgorithms/Python/blob/master/web_programming/instagram_crawler.py) * [Instagram Pic](https://github.com/TheAlgorithms/Python/blob/master/web_programming/instagram_pic.py) * [Instagram Video](https://github.com/TheAlgorithms/Python/blob/master/web_programming/instagram_video.py) * [Nasa Data](https://github.com/TheAlgorithms/Python/blob/master/web_programming/nasa_data.py) * [Random Anime Character](https://github.com/TheAlgorithms/Python/blob/master/web_programming/random_anime_character.py) * [Recaptcha Verification](https://github.com/TheAlgorithms/Python/blob/master/web_programming/recaptcha_verification.py) * [Slack Message](https://github.com/TheAlgorithms/Python/blob/master/web_programming/slack_message.py) * [Test Fetch Github Info](https://github.com/TheAlgorithms/Python/blob/master/web_programming/test_fetch_github_info.py) * [World Covid19 Stats](https://github.com/TheAlgorithms/Python/blob/master/web_programming/world_covid19_stats.py)
1
TheAlgorithms/Python
5,638
Add pyupgrade to pre-commit
### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2021-10-28T14:37:17Z"
"2021-10-28T14:45:59Z"
70368a757e8d37b9f3dd96af4ca535275cb39580
477cc3fe597fd931c742700284016b937c778fe1
Add pyupgrade to pre-commit. ### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
#!/usr/local/bin/python3 """ Problem Description: Given two binary tree, return the merged tree. The rule for merging is that if two nodes overlap, then put the value sum of both nodes to the new value of the merged node. Otherwise, the NOT null node will be used as the node of new tree. """ from __future__ import annotations from typing import Optional class Node: """ A binary node has value variable and pointers to its left and right node. """ def __init__(self, value: int = 0) -> None: self.value = value self.left: Node | None = None self.right: Node | None = None def merge_two_binary_trees(tree1: Node | None, tree2: Node | None) -> Optional[Node]: """ Returns root node of the merged tree. >>> tree1 = Node(5) >>> tree1.left = Node(6) >>> tree1.right = Node(7) >>> tree1.left.left = Node(2) >>> tree2 = Node(4) >>> tree2.left = Node(5) >>> tree2.right = Node(8) >>> tree2.left.right = Node(1) >>> tree2.right.right = Node(4) >>> merged_tree = merge_two_binary_trees(tree1, tree2) >>> print_preorder(merged_tree) 9 11 2 1 15 4 """ if tree1 is None: return tree2 if tree2 is None: return tree1 tree1.value = tree1.value + tree2.value tree1.left = merge_two_binary_trees(tree1.left, tree2.left) tree1.right = merge_two_binary_trees(tree1.right, tree2.right) return tree1 def print_preorder(root: Node | None) -> None: """ Print pre-order traversal of the tree. >>> root = Node(1) >>> root.left = Node(2) >>> root.right = Node(3) >>> print_preorder(root) 1 2 3 >>> print_preorder(root.right) 3 """ if root: print(root.value) print_preorder(root.left) print_preorder(root.right) if __name__ == "__main__": tree1 = Node(1) tree1.left = Node(2) tree1.right = Node(3) tree1.left.left = Node(4) tree2 = Node(2) tree2.left = Node(4) tree2.right = Node(6) tree2.left.right = Node(9) tree2.right.right = Node(5) print("Tree1 is: ") print_preorder(tree1) print("Tree2 is: ") print_preorder(tree2) merged_tree = merge_two_binary_trees(tree1, tree2) print("Merged Tree is: ") print_preorder(merged_tree)
#!/usr/local/bin/python3 """ Problem Description: Given two binary tree, return the merged tree. The rule for merging is that if two nodes overlap, then put the value sum of both nodes to the new value of the merged node. Otherwise, the NOT null node will be used as the node of new tree. """ from __future__ import annotations class Node: """ A binary node has value variable and pointers to its left and right node. """ def __init__(self, value: int = 0) -> None: self.value = value self.left: Node | None = None self.right: Node | None = None def merge_two_binary_trees(tree1: Node | None, tree2: Node | None) -> Node | None: """ Returns root node of the merged tree. >>> tree1 = Node(5) >>> tree1.left = Node(6) >>> tree1.right = Node(7) >>> tree1.left.left = Node(2) >>> tree2 = Node(4) >>> tree2.left = Node(5) >>> tree2.right = Node(8) >>> tree2.left.right = Node(1) >>> tree2.right.right = Node(4) >>> merged_tree = merge_two_binary_trees(tree1, tree2) >>> print_preorder(merged_tree) 9 11 2 1 15 4 """ if tree1 is None: return tree2 if tree2 is None: return tree1 tree1.value = tree1.value + tree2.value tree1.left = merge_two_binary_trees(tree1.left, tree2.left) tree1.right = merge_two_binary_trees(tree1.right, tree2.right) return tree1 def print_preorder(root: Node | None) -> None: """ Print pre-order traversal of the tree. >>> root = Node(1) >>> root.left = Node(2) >>> root.right = Node(3) >>> print_preorder(root) 1 2 3 >>> print_preorder(root.right) 3 """ if root: print(root.value) print_preorder(root.left) print_preorder(root.right) if __name__ == "__main__": tree1 = Node(1) tree1.left = Node(2) tree1.right = Node(3) tree1.left.left = Node(4) tree2 = Node(2) tree2.left = Node(4) tree2.right = Node(6) tree2.left.right = Node(9) tree2.right.right = Node(5) print("Tree1 is: ") print_preorder(tree1) print("Tree2 is: ") print_preorder(tree2) merged_tree = merge_two_binary_trees(tree1, tree2) print("Merged Tree is: ") print_preorder(merged_tree)
1
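The before/after pair in the record above differs only in typing style: once `from __future__ import annotations` is in effect, pyupgrade rewrites the `Optional[Node]` annotations to the PEP 604 form `Node | None` and the `typing` import becomes unnecessary. A minimal sketch of that rewrite (illustrative only; `merge_before` and `merge_after` are hypothetical names standing in for the record's single `merge_two_binary_trees` function):

from __future__ import annotations

from typing import Optional  # needed only for the "before" spelling shown below


class Node:
    ...


# Before the PR (typing.Optional spelling):
def merge_before(tree1: Optional[Node], tree2: Optional[Node]) -> Optional[Node]:
    ...


# After pyupgrade (PEP 604 unions; no typing import required):
def merge_after(tree1: Node | None, tree2: Node | None) -> Node | None:
    ...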
TheAlgorithms/Python
5,638
Add pyupgrade to pre-commit
### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2021-10-28T14:37:17Z"
"2021-10-28T14:45:59Z"
70368a757e8d37b9f3dd96af4ca535275cb39580
477cc3fe597fd931c742700284016b937c778fe1
Add pyupgrade to pre-commit. ### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Based on "Skip Lists: A Probabilistic Alternative to Balanced Trees" by William Pugh https://epaperpress.com/sortsearch/download/skiplist.pdf """ from __future__ import annotations from random import random from typing import Generic, Optional, TypeVar, Union KT = TypeVar("KT") VT = TypeVar("VT") class Node(Generic[KT, VT]): def __init__(self, key: Union[KT, str] = "root", value: Optional[VT] = None): self.key = key self.value = value self.forward: list[Node[KT, VT]] = [] def __repr__(self) -> str: """ :return: Visual representation of Node >>> node = Node("Key", 2) >>> repr(node) 'Node(Key: 2)' """ return f"Node({self.key}: {self.value})" @property def level(self) -> int: """ :return: Number of forward references >>> node = Node("Key", 2) >>> node.level 0 >>> node.forward.append(Node("Key2", 4)) >>> node.level 1 >>> node.forward.append(Node("Key3", 6)) >>> node.level 2 """ return len(self.forward) class SkipList(Generic[KT, VT]): def __init__(self, p: float = 0.5, max_level: int = 16): self.head: Node[KT, VT] = Node[KT, VT]() self.level = 0 self.p = p self.max_level = max_level def __str__(self) -> str: """ :return: Visual representation of SkipList >>> skip_list = SkipList() >>> print(skip_list) SkipList(level=0) >>> skip_list.insert("Key1", "Value") >>> print(skip_list) # doctest: +ELLIPSIS SkipList(level=... [root]--... [Key1]--Key1... None *... >>> skip_list.insert("Key2", "OtherValue") >>> print(skip_list) # doctest: +ELLIPSIS SkipList(level=... [root]--... [Key1]--Key1... [Key2]--Key2... None *... """ items = list(self) if len(items) == 0: return f"SkipList(level={self.level})" label_size = max((len(str(item)) for item in items), default=4) label_size = max(label_size, 4) + 4 node = self.head lines = [] forwards = node.forward.copy() lines.append(f"[{node.key}]".ljust(label_size, "-") + "* " * len(forwards)) lines.append(" " * label_size + "| " * len(forwards)) while len(node.forward) != 0: node = node.forward[0] lines.append( f"[{node.key}]".ljust(label_size, "-") + " ".join(str(n.key) if n.key == node.key else "|" for n in forwards) ) lines.append(" " * label_size + "| " * len(forwards)) forwards[: node.level] = node.forward lines.append("None".ljust(label_size) + "* " * len(forwards)) return f"SkipList(level={self.level})\n" + "\n".join(lines) def __iter__(self): node = self.head while len(node.forward) != 0: yield node.forward[0].key node = node.forward[0] def random_level(self) -> int: """ :return: Random level from [1, self.max_level] interval. Higher values are less likely. """ level = 1 while random() < self.p and level < self.max_level: level += 1 return level def _locate_node(self, key) -> tuple[Node[KT, VT] | None, list[Node[KT, VT]]]: """ :param key: Searched key, :return: Tuple with searched node (or None if given key is not present) and list of nodes that refer (if key is present) of should refer to given node. """ # Nodes with refer or should refer to output node update_vector = [] node = self.head for i in reversed(range(self.level)): # i < node.level - When node level is lesser than `i` decrement `i`. # node.forward[i].key < key - Jumping to node with key value higher # or equal to searched key would result # in skipping searched key. while i < node.level and node.forward[i].key < key: node = node.forward[i] # Each leftmost node (relative to searched node) will potentially have to # be updated. update_vector.append(node) update_vector.reverse() # Note that we were inserting values in reverse order. 
# len(node.forward) != 0 - If current node doesn't contain any further # references then searched key is not present. # node.forward[0].key == key - Next node key should be equal to search key # if key is present. if len(node.forward) != 0 and node.forward[0].key == key: return node.forward[0], update_vector else: return None, update_vector def delete(self, key: KT): """ :param key: Key to remove from list. >>> skip_list = SkipList() >>> skip_list.insert(2, "Two") >>> skip_list.insert(1, "One") >>> skip_list.insert(3, "Three") >>> list(skip_list) [1, 2, 3] >>> skip_list.delete(2) >>> list(skip_list) [1, 3] """ node, update_vector = self._locate_node(key) if node is not None: for i, update_node in enumerate(update_vector): # Remove or replace all references to removed node. if update_node.level > i and update_node.forward[i].key == key: if node.level > i: update_node.forward[i] = node.forward[i] else: update_node.forward = update_node.forward[:i] def insert(self, key: KT, value: VT): """ :param key: Key to insert. :param value: Value associated with given key. >>> skip_list = SkipList() >>> skip_list.insert(2, "Two") >>> skip_list.find(2) 'Two' >>> list(skip_list) [2] """ node, update_vector = self._locate_node(key) if node is not None: node.value = value else: level = self.random_level() if level > self.level: # After level increase we have to add additional nodes to head. for i in range(self.level - 1, level): update_vector.append(self.head) self.level = level new_node = Node(key, value) for i, update_node in enumerate(update_vector[:level]): # Change references to pass through new node. if update_node.level > i: new_node.forward.append(update_node.forward[i]) if update_node.level < i + 1: update_node.forward.append(new_node) else: update_node.forward[i] = new_node def find(self, key: VT) -> VT | None: """ :param key: Search key. :return: Value associated with given key or None if given key is not present. 
>>> skip_list = SkipList() >>> skip_list.find(2) >>> skip_list.insert(2, "Two") >>> skip_list.find(2) 'Two' >>> skip_list.insert(2, "Three") >>> skip_list.find(2) 'Three' """ node, _ = self._locate_node(key) if node is not None: return node.value return None def test_insert(): skip_list = SkipList() skip_list.insert("Key1", 3) skip_list.insert("Key2", 12) skip_list.insert("Key3", 41) skip_list.insert("Key4", -19) node = skip_list.head all_values = {} while node.level != 0: node = node.forward[0] all_values[node.key] = node.value assert len(all_values) == 4 assert all_values["Key1"] == 3 assert all_values["Key2"] == 12 assert all_values["Key3"] == 41 assert all_values["Key4"] == -19 def test_insert_overrides_existing_value(): skip_list = SkipList() skip_list.insert("Key1", 10) skip_list.insert("Key1", 12) skip_list.insert("Key5", 7) skip_list.insert("Key7", 10) skip_list.insert("Key10", 5) skip_list.insert("Key7", 7) skip_list.insert("Key5", 5) skip_list.insert("Key10", 10) node = skip_list.head all_values = {} while node.level != 0: node = node.forward[0] all_values[node.key] = node.value if len(all_values) != 4: print() assert len(all_values) == 4 assert all_values["Key1"] == 12 assert all_values["Key7"] == 7 assert all_values["Key5"] == 5 assert all_values["Key10"] == 10 def test_searching_empty_list_returns_none(): skip_list = SkipList() assert skip_list.find("Some key") is None def test_search(): skip_list = SkipList() skip_list.insert("Key2", 20) assert skip_list.find("Key2") == 20 skip_list.insert("Some Key", 10) skip_list.insert("Key2", 8) skip_list.insert("V", 13) assert skip_list.find("Y") is None assert skip_list.find("Key2") == 8 assert skip_list.find("Some Key") == 10 assert skip_list.find("V") == 13 def test_deleting_item_from_empty_list_do_nothing(): skip_list = SkipList() skip_list.delete("Some key") assert len(skip_list.head.forward) == 0 def test_deleted_items_are_not_founded_by_find_method(): skip_list = SkipList() skip_list.insert("Key1", 12) skip_list.insert("V", 13) skip_list.insert("X", 14) skip_list.insert("Key2", 15) skip_list.delete("V") skip_list.delete("Key2") assert skip_list.find("V") is None assert skip_list.find("Key2") is None def test_delete_removes_only_given_key(): skip_list = SkipList() skip_list.insert("Key1", 12) skip_list.insert("V", 13) skip_list.insert("X", 14) skip_list.insert("Key2", 15) skip_list.delete("V") assert skip_list.find("V") is None assert skip_list.find("X") == 14 assert skip_list.find("Key1") == 12 assert skip_list.find("Key2") == 15 skip_list.delete("X") assert skip_list.find("V") is None assert skip_list.find("X") is None assert skip_list.find("Key1") == 12 assert skip_list.find("Key2") == 15 skip_list.delete("Key1") assert skip_list.find("V") is None assert skip_list.find("X") is None assert skip_list.find("Key1") is None assert skip_list.find("Key2") == 15 skip_list.delete("Key2") assert skip_list.find("V") is None assert skip_list.find("X") is None assert skip_list.find("Key1") is None assert skip_list.find("Key2") is None def test_delete_doesnt_leave_dead_nodes(): skip_list = SkipList() skip_list.insert("Key1", 12) skip_list.insert("V", 13) skip_list.insert("X", 142) skip_list.insert("Key2", 15) skip_list.delete("X") def traverse_keys(node): yield node.key for forward_node in node.forward: yield from traverse_keys(forward_node) assert len(set(traverse_keys(skip_list.head))) == 4 def test_iter_always_yields_sorted_values(): def is_sorted(lst): for item, next_item in zip(lst, lst[1:]): if next_item < item: return False return True 
skip_list = SkipList() for i in range(10): skip_list.insert(i, i) assert is_sorted(list(skip_list)) skip_list.delete(5) skip_list.delete(8) skip_list.delete(2) assert is_sorted(list(skip_list)) skip_list.insert(-12, -12) skip_list.insert(77, 77) assert is_sorted(list(skip_list)) def pytests(): for i in range(100): # Repeat test 100 times due to the probabilistic nature of skip list # random values == random bugs test_insert() test_insert_overrides_existing_value() test_searching_empty_list_returns_none() test_search() test_deleting_item_from_empty_list_do_nothing() test_deleted_items_are_not_founded_by_find_method() test_delete_removes_only_given_key() test_delete_doesnt_leave_dead_nodes() test_iter_always_yields_sorted_values() def main(): """ >>> pytests() """ skip_list = SkipList() skip_list.insert(2, "2") skip_list.insert(4, "4") skip_list.insert(6, "4") skip_list.insert(4, "5") skip_list.insert(8, "4") skip_list.insert(9, "4") skip_list.delete(4) print(skip_list) if __name__ == "__main__": main()
""" Based on "Skip Lists: A Probabilistic Alternative to Balanced Trees" by William Pugh https://epaperpress.com/sortsearch/download/skiplist.pdf """ from __future__ import annotations from random import random from typing import Generic, TypeVar KT = TypeVar("KT") VT = TypeVar("VT") class Node(Generic[KT, VT]): def __init__(self, key: KT | str = "root", value: VT | None = None): self.key = key self.value = value self.forward: list[Node[KT, VT]] = [] def __repr__(self) -> str: """ :return: Visual representation of Node >>> node = Node("Key", 2) >>> repr(node) 'Node(Key: 2)' """ return f"Node({self.key}: {self.value})" @property def level(self) -> int: """ :return: Number of forward references >>> node = Node("Key", 2) >>> node.level 0 >>> node.forward.append(Node("Key2", 4)) >>> node.level 1 >>> node.forward.append(Node("Key3", 6)) >>> node.level 2 """ return len(self.forward) class SkipList(Generic[KT, VT]): def __init__(self, p: float = 0.5, max_level: int = 16): self.head: Node[KT, VT] = Node[KT, VT]() self.level = 0 self.p = p self.max_level = max_level def __str__(self) -> str: """ :return: Visual representation of SkipList >>> skip_list = SkipList() >>> print(skip_list) SkipList(level=0) >>> skip_list.insert("Key1", "Value") >>> print(skip_list) # doctest: +ELLIPSIS SkipList(level=... [root]--... [Key1]--Key1... None *... >>> skip_list.insert("Key2", "OtherValue") >>> print(skip_list) # doctest: +ELLIPSIS SkipList(level=... [root]--... [Key1]--Key1... [Key2]--Key2... None *... """ items = list(self) if len(items) == 0: return f"SkipList(level={self.level})" label_size = max((len(str(item)) for item in items), default=4) label_size = max(label_size, 4) + 4 node = self.head lines = [] forwards = node.forward.copy() lines.append(f"[{node.key}]".ljust(label_size, "-") + "* " * len(forwards)) lines.append(" " * label_size + "| " * len(forwards)) while len(node.forward) != 0: node = node.forward[0] lines.append( f"[{node.key}]".ljust(label_size, "-") + " ".join(str(n.key) if n.key == node.key else "|" for n in forwards) ) lines.append(" " * label_size + "| " * len(forwards)) forwards[: node.level] = node.forward lines.append("None".ljust(label_size) + "* " * len(forwards)) return f"SkipList(level={self.level})\n" + "\n".join(lines) def __iter__(self): node = self.head while len(node.forward) != 0: yield node.forward[0].key node = node.forward[0] def random_level(self) -> int: """ :return: Random level from [1, self.max_level] interval. Higher values are less likely. """ level = 1 while random() < self.p and level < self.max_level: level += 1 return level def _locate_node(self, key) -> tuple[Node[KT, VT] | None, list[Node[KT, VT]]]: """ :param key: Searched key, :return: Tuple with searched node (or None if given key is not present) and list of nodes that refer (if key is present) of should refer to given node. """ # Nodes with refer or should refer to output node update_vector = [] node = self.head for i in reversed(range(self.level)): # i < node.level - When node level is lesser than `i` decrement `i`. # node.forward[i].key < key - Jumping to node with key value higher # or equal to searched key would result # in skipping searched key. while i < node.level and node.forward[i].key < key: node = node.forward[i] # Each leftmost node (relative to searched node) will potentially have to # be updated. update_vector.append(node) update_vector.reverse() # Note that we were inserting values in reverse order. 
# len(node.forward) != 0 - If current node doesn't contain any further # references then searched key is not present. # node.forward[0].key == key - Next node key should be equal to search key # if key is present. if len(node.forward) != 0 and node.forward[0].key == key: return node.forward[0], update_vector else: return None, update_vector def delete(self, key: KT): """ :param key: Key to remove from list. >>> skip_list = SkipList() >>> skip_list.insert(2, "Two") >>> skip_list.insert(1, "One") >>> skip_list.insert(3, "Three") >>> list(skip_list) [1, 2, 3] >>> skip_list.delete(2) >>> list(skip_list) [1, 3] """ node, update_vector = self._locate_node(key) if node is not None: for i, update_node in enumerate(update_vector): # Remove or replace all references to removed node. if update_node.level > i and update_node.forward[i].key == key: if node.level > i: update_node.forward[i] = node.forward[i] else: update_node.forward = update_node.forward[:i] def insert(self, key: KT, value: VT): """ :param key: Key to insert. :param value: Value associated with given key. >>> skip_list = SkipList() >>> skip_list.insert(2, "Two") >>> skip_list.find(2) 'Two' >>> list(skip_list) [2] """ node, update_vector = self._locate_node(key) if node is not None: node.value = value else: level = self.random_level() if level > self.level: # After level increase we have to add additional nodes to head. for i in range(self.level - 1, level): update_vector.append(self.head) self.level = level new_node = Node(key, value) for i, update_node in enumerate(update_vector[:level]): # Change references to pass through new node. if update_node.level > i: new_node.forward.append(update_node.forward[i]) if update_node.level < i + 1: update_node.forward.append(new_node) else: update_node.forward[i] = new_node def find(self, key: VT) -> VT | None: """ :param key: Search key. :return: Value associated with given key or None if given key is not present. 
>>> skip_list = SkipList() >>> skip_list.find(2) >>> skip_list.insert(2, "Two") >>> skip_list.find(2) 'Two' >>> skip_list.insert(2, "Three") >>> skip_list.find(2) 'Three' """ node, _ = self._locate_node(key) if node is not None: return node.value return None def test_insert(): skip_list = SkipList() skip_list.insert("Key1", 3) skip_list.insert("Key2", 12) skip_list.insert("Key3", 41) skip_list.insert("Key4", -19) node = skip_list.head all_values = {} while node.level != 0: node = node.forward[0] all_values[node.key] = node.value assert len(all_values) == 4 assert all_values["Key1"] == 3 assert all_values["Key2"] == 12 assert all_values["Key3"] == 41 assert all_values["Key4"] == -19 def test_insert_overrides_existing_value(): skip_list = SkipList() skip_list.insert("Key1", 10) skip_list.insert("Key1", 12) skip_list.insert("Key5", 7) skip_list.insert("Key7", 10) skip_list.insert("Key10", 5) skip_list.insert("Key7", 7) skip_list.insert("Key5", 5) skip_list.insert("Key10", 10) node = skip_list.head all_values = {} while node.level != 0: node = node.forward[0] all_values[node.key] = node.value if len(all_values) != 4: print() assert len(all_values) == 4 assert all_values["Key1"] == 12 assert all_values["Key7"] == 7 assert all_values["Key5"] == 5 assert all_values["Key10"] == 10 def test_searching_empty_list_returns_none(): skip_list = SkipList() assert skip_list.find("Some key") is None def test_search(): skip_list = SkipList() skip_list.insert("Key2", 20) assert skip_list.find("Key2") == 20 skip_list.insert("Some Key", 10) skip_list.insert("Key2", 8) skip_list.insert("V", 13) assert skip_list.find("Y") is None assert skip_list.find("Key2") == 8 assert skip_list.find("Some Key") == 10 assert skip_list.find("V") == 13 def test_deleting_item_from_empty_list_do_nothing(): skip_list = SkipList() skip_list.delete("Some key") assert len(skip_list.head.forward) == 0 def test_deleted_items_are_not_founded_by_find_method(): skip_list = SkipList() skip_list.insert("Key1", 12) skip_list.insert("V", 13) skip_list.insert("X", 14) skip_list.insert("Key2", 15) skip_list.delete("V") skip_list.delete("Key2") assert skip_list.find("V") is None assert skip_list.find("Key2") is None def test_delete_removes_only_given_key(): skip_list = SkipList() skip_list.insert("Key1", 12) skip_list.insert("V", 13) skip_list.insert("X", 14) skip_list.insert("Key2", 15) skip_list.delete("V") assert skip_list.find("V") is None assert skip_list.find("X") == 14 assert skip_list.find("Key1") == 12 assert skip_list.find("Key2") == 15 skip_list.delete("X") assert skip_list.find("V") is None assert skip_list.find("X") is None assert skip_list.find("Key1") == 12 assert skip_list.find("Key2") == 15 skip_list.delete("Key1") assert skip_list.find("V") is None assert skip_list.find("X") is None assert skip_list.find("Key1") is None assert skip_list.find("Key2") == 15 skip_list.delete("Key2") assert skip_list.find("V") is None assert skip_list.find("X") is None assert skip_list.find("Key1") is None assert skip_list.find("Key2") is None def test_delete_doesnt_leave_dead_nodes(): skip_list = SkipList() skip_list.insert("Key1", 12) skip_list.insert("V", 13) skip_list.insert("X", 142) skip_list.insert("Key2", 15) skip_list.delete("X") def traverse_keys(node): yield node.key for forward_node in node.forward: yield from traverse_keys(forward_node) assert len(set(traverse_keys(skip_list.head))) == 4 def test_iter_always_yields_sorted_values(): def is_sorted(lst): for item, next_item in zip(lst, lst[1:]): if next_item < item: return False return True 
skip_list = SkipList() for i in range(10): skip_list.insert(i, i) assert is_sorted(list(skip_list)) skip_list.delete(5) skip_list.delete(8) skip_list.delete(2) assert is_sorted(list(skip_list)) skip_list.insert(-12, -12) skip_list.insert(77, 77) assert is_sorted(list(skip_list)) def pytests(): for i in range(100): # Repeat test 100 times due to the probabilistic nature of skip list # random values == random bugs test_insert() test_insert_overrides_existing_value() test_searching_empty_list_returns_none() test_search() test_deleting_item_from_empty_list_do_nothing() test_deleted_items_are_not_founded_by_find_method() test_delete_removes_only_given_key() test_delete_doesnt_leave_dead_nodes() test_iter_always_yields_sorted_values() def main(): """ >>> pytests() """ skip_list = SkipList() skip_list.insert(2, "2") skip_list.insert(4, "4") skip_list.insert(6, "4") skip_list.insert(4, "5") skip_list.insert(8, "4") skip_list.insert(9, "4") skip_list.delete(4) print(skip_list) if __name__ == "__main__": main()
1
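The skip-list record follows the same pattern: the algorithm is untouched and only the annotations change. The lines pyupgrade actually rewrites are sketched below (class bodies elided; everything else in the module stays as shown above):

from __future__ import annotations

from typing import Generic, TypeVar  # Optional and Union no longer need to be imported

KT = TypeVar("KT")
VT = TypeVar("VT")


class Node(Generic[KT, VT]):
    def __init__(self, key: KT | str = "root", value: VT | None = None):
        # previously: key: Union[KT, str], value: Optional[VT]
        self.key = key
        self.value = value
        self.forward: list[Node[KT, VT]] = []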
TheAlgorithms/Python
5,638
Add pyupgrade to pre-commit
### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2021-10-28T14:37:17Z"
"2021-10-28T14:45:59Z"
70368a757e8d37b9f3dd96af4ca535275cb39580
477cc3fe597fd931c742700284016b937c778fe1
Add pyupgrade to pre-commit. ### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
-1
TheAlgorithms/Python
5,638
Add pyupgrade to pre-commit
### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2021-10-28T14:37:17Z"
"2021-10-28T14:45:59Z"
70368a757e8d37b9f3dd96af4ca535275cb39580
477cc3fe597fd931c742700284016b937c778fe1
Add pyupgrade to pre-commit. ### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Problem 46: https://projecteuler.net/problem=46 It was proposed by Christian Goldbach that every odd composite number can be written as the sum of a prime and twice a square. 9 = 7 + 2 × 12 15 = 7 + 2 × 22 21 = 3 + 2 × 32 25 = 7 + 2 × 32 27 = 19 + 2 × 22 33 = 31 + 2 × 12 It turns out that the conjecture was false. What is the smallest odd composite that cannot be written as the sum of a prime and twice a square? """ from __future__ import annotations seive = [True] * 100001 i = 2 while i * i <= 100000: if seive[i]: for j in range(i * i, 100001, i): seive[j] = False i += 1 def is_prime(n: int) -> bool: """ Returns True if n is prime, False otherwise, for 2 <= n <= 100000 >>> is_prime(87) False >>> is_prime(23) True >>> is_prime(25363) False """ return seive[n] odd_composites = [num for num in range(3, len(seive), 2) if not is_prime(num)] def compute_nums(n: int) -> list[int]: """ Returns a list of first n odd composite numbers which do not follow the conjecture. >>> compute_nums(1) [5777] >>> compute_nums(2) [5777, 5993] >>> compute_nums(0) Traceback (most recent call last): ... ValueError: n must be >= 0 >>> compute_nums("a") Traceback (most recent call last): ... ValueError: n must be an integer >>> compute_nums(1.1) Traceback (most recent call last): ... ValueError: n must be an integer """ if not isinstance(n, int): raise ValueError("n must be an integer") if n <= 0: raise ValueError("n must be >= 0") list_nums = [] for num in range(len(odd_composites)): i = 0 while 2 * i * i <= odd_composites[num]: rem = odd_composites[num] - 2 * i * i if is_prime(rem): break i += 1 else: list_nums.append(odd_composites[num]) if len(list_nums) == n: return list_nums return [] def solution() -> int: """Return the solution to the problem""" return compute_nums(1)[0] if __name__ == "__main__": print(f"{solution() = }")
""" Problem 46: https://projecteuler.net/problem=46 It was proposed by Christian Goldbach that every odd composite number can be written as the sum of a prime and twice a square. 9 = 7 + 2 × 12 15 = 7 + 2 × 22 21 = 3 + 2 × 32 25 = 7 + 2 × 32 27 = 19 + 2 × 22 33 = 31 + 2 × 12 It turns out that the conjecture was false. What is the smallest odd composite that cannot be written as the sum of a prime and twice a square? """ from __future__ import annotations seive = [True] * 100001 i = 2 while i * i <= 100000: if seive[i]: for j in range(i * i, 100001, i): seive[j] = False i += 1 def is_prime(n: int) -> bool: """ Returns True if n is prime, False otherwise, for 2 <= n <= 100000 >>> is_prime(87) False >>> is_prime(23) True >>> is_prime(25363) False """ return seive[n] odd_composites = [num for num in range(3, len(seive), 2) if not is_prime(num)] def compute_nums(n: int) -> list[int]: """ Returns a list of first n odd composite numbers which do not follow the conjecture. >>> compute_nums(1) [5777] >>> compute_nums(2) [5777, 5993] >>> compute_nums(0) Traceback (most recent call last): ... ValueError: n must be >= 0 >>> compute_nums("a") Traceback (most recent call last): ... ValueError: n must be an integer >>> compute_nums(1.1) Traceback (most recent call last): ... ValueError: n must be an integer """ if not isinstance(n, int): raise ValueError("n must be an integer") if n <= 0: raise ValueError("n must be >= 0") list_nums = [] for num in range(len(odd_composites)): i = 0 while 2 * i * i <= odd_composites[num]: rem = odd_composites[num] - 2 * i * i if is_prime(rem): break i += 1 else: list_nums.append(odd_composites[num]) if len(list_nums) == n: return list_nums return [] def solution() -> int: """Return the solution to the problem""" return compute_nums(1)[0] if __name__ == "__main__": print(f"{solution() = }")
-1
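Both content fields of this record are identical, so the Problem 46 file is not modified by the PR. For readers of the code above, the heart of `compute_nums` is the test that an odd composite cannot be written as a prime plus twice a square; a standalone, hedged sketch of that test (using simple trial division here instead of the module's precomputed sieve):

from math import isqrt


def is_prime(n: int) -> bool:
    # trial division; the record above precomputes a sieve instead
    return n > 1 and all(n % divisor for divisor in range(2, isqrt(n) + 1))


def cannot_be_written(odd_composite: int) -> bool:
    # True if odd_composite is not prime + 2 * i * i for any i >= 0
    i = 0
    while 2 * i * i <= odd_composite:
        if is_prime(odd_composite - 2 * i * i):
            return False
        i += 1
    return True


# 5777 is the first counterexample, matching the compute_nums(1) doctest above
assert cannot_be_written(5777)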
TheAlgorithms/Python
5,638
Add pyupgrade to pre-commit
### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2021-10-28T14:37:17Z"
"2021-10-28T14:45:59Z"
70368a757e8d37b9f3dd96af4ca535275cb39580
477cc3fe597fd931c742700284016b937c778fe1
Add pyupgrade to pre-commit. ### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
#
#
-1
TheAlgorithms/Python
5,638
Add pyupgrade to pre-commit
### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2021-10-28T14:37:17Z"
"2021-10-28T14:45:59Z"
70368a757e8d37b9f3dd96af4ca535275cb39580
477cc3fe597fd931c742700284016b937c778fe1
Add pyupgrade to pre-commit. ### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Project Euler Problem 6: https://projecteuler.net/problem=6 Sum square difference The sum of the squares of the first ten natural numbers is, 1^2 + 2^2 + ... + 10^2 = 385 The square of the sum of the first ten natural numbers is, (1 + 2 + ... + 10)^2 = 55^2 = 3025 Hence the difference between the sum of the squares of the first ten natural numbers and the square of the sum is 3025 - 385 = 2640. Find the difference between the sum of the squares of the first one hundred natural numbers and the square of the sum. """ def solution(n: int = 100) -> int: """ Returns the difference between the sum of the squares of the first n natural numbers and the square of the sum. >>> solution(10) 2640 >>> solution(15) 13160 >>> solution(20) 41230 >>> solution(50) 1582700 """ sum_of_squares = n * (n + 1) * (2 * n + 1) / 6 square_of_sum = (n * (n + 1) / 2) ** 2 return int(square_of_sum - sum_of_squares) if __name__ == "__main__": print(f"{solution() = }")
""" Project Euler Problem 6: https://projecteuler.net/problem=6 Sum square difference The sum of the squares of the first ten natural numbers is, 1^2 + 2^2 + ... + 10^2 = 385 The square of the sum of the first ten natural numbers is, (1 + 2 + ... + 10)^2 = 55^2 = 3025 Hence the difference between the sum of the squares of the first ten natural numbers and the square of the sum is 3025 - 385 = 2640. Find the difference between the sum of the squares of the first one hundred natural numbers and the square of the sum. """ def solution(n: int = 100) -> int: """ Returns the difference between the sum of the squares of the first n natural numbers and the square of the sum. >>> solution(10) 2640 >>> solution(15) 13160 >>> solution(20) 41230 >>> solution(50) 1582700 """ sum_of_squares = n * (n + 1) * (2 * n + 1) / 6 square_of_sum = (n * (n + 1) / 2) ** 2 return int(square_of_sum - sum_of_squares) if __name__ == "__main__": print(f"{solution() = }")
-1
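The Problem 6 record is likewise unchanged (identical before/after content). Its solution leans on two closed forms: sum of squares = n(n + 1)(2n + 1) / 6 and square of the sum = (n(n + 1) / 2) ** 2. A quick check with n = 10 against the numbers quoted in the problem statement:

n = 10
sum_of_squares = n * (n + 1) * (2 * n + 1) // 6  # 385
square_of_sum = (n * (n + 1) // 2) ** 2  # 3025
assert square_of_sum - sum_of_squares == 2640  # agrees with solution(10) in the doctest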
TheAlgorithms/Python
5,638
Add pyupgrade to pre-commit
### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2021-10-28T14:37:17Z"
"2021-10-28T14:45:59Z"
70368a757e8d37b9f3dd96af4ca535275cb39580
477cc3fe597fd931c742700284016b937c778fe1
Add pyupgrade to pre-commit. ### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
def reverse_letters(input_str: str) -> str: """ Reverses letters in a given string without adjusting the position of the words >>> reverse_letters('The cat in the hat') 'ehT tac ni eht tah' >>> reverse_letters('The quick brown fox jumped over the lazy dog.') 'ehT kciuq nworb xof depmuj revo eht yzal .god' >>> reverse_letters('Is this true?') 'sI siht ?eurt' >>> reverse_letters("I love Python") 'I evol nohtyP' """ return " ".join([word[::-1] for word in input_str.split()]) if __name__ == "__main__": import doctest doctest.testmod()
def reverse_letters(input_str: str) -> str: """ Reverses letters in a given string without adjusting the position of the words >>> reverse_letters('The cat in the hat') 'ehT tac ni eht tah' >>> reverse_letters('The quick brown fox jumped over the lazy dog.') 'ehT kciuq nworb xof depmuj revo eht yzal .god' >>> reverse_letters('Is this true?') 'sI siht ?eurt' >>> reverse_letters("I love Python") 'I evol nohtyP' """ return " ".join([word[::-1] for word in input_str.split()]) if __name__ == "__main__": import doctest doctest.testmod()
-1
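One subtlety worth noting about the `reverse_letters` record above: `str.split()` with no separator collapses runs of whitespace, so the output always uses single spaces regardless of the input spacing. A small illustration of that behaviour (same one-line implementation as in the record):

def reverse_letters(input_str: str) -> str:
    return " ".join(word[::-1] for word in input_str.split())


print(reverse_letters("double  space"))  # prints: elbuod ecaps  (the two spaces collapse to one)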
TheAlgorithms/Python
5,638
Add pyupgrade to pre-commit
### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2021-10-28T14:37:17Z"
"2021-10-28T14:45:59Z"
70368a757e8d37b9f3dd96af4ca535275cb39580
477cc3fe597fd931c742700284016b937c778fe1
Add pyupgrade to pre-commit. ### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
import os import requests from bs4 import BeautifulSoup from fake_useragent import UserAgent headers = {"UserAgent": UserAgent().random} URL = "https://www.mywaifulist.moe/random" def save_image(image_url: str, image_title: str) -> None: """ Saves the image of anime character """ image = requests.get(image_url, headers=headers) with open(image_title, "wb") as file: file.write(image.content) def random_anime_character() -> tuple[str, str, str]: """ Returns the Title, Description, and Image Title of a random anime character . """ soup = BeautifulSoup(requests.get(URL, headers=headers).text, "html.parser") title = soup.find("meta", attrs={"property": "og:title"}).attrs["content"] image_url = soup.find("meta", attrs={"property": "og:image"}).attrs["content"] description = soup.find("p", id="description").get_text() _, image_extension = os.path.splitext(os.path.basename(image_url)) image_title = title.strip().replace(" ", "_") image_title = f"{image_title}{image_extension}" save_image(image_url, image_title) return (title, description, image_title) if __name__ == "__main__": title, desc, image_title = random_anime_character() print(f"{title}\n\n{desc}\n\nImage saved : {image_title}")
import os import requests from bs4 import BeautifulSoup from fake_useragent import UserAgent headers = {"UserAgent": UserAgent().random} URL = "https://www.mywaifulist.moe/random" def save_image(image_url: str, image_title: str) -> None: """ Saves the image of anime character """ image = requests.get(image_url, headers=headers) with open(image_title, "wb") as file: file.write(image.content) def random_anime_character() -> tuple[str, str, str]: """ Returns the Title, Description, and Image Title of a random anime character . """ soup = BeautifulSoup(requests.get(URL, headers=headers).text, "html.parser") title = soup.find("meta", attrs={"property": "og:title"}).attrs["content"] image_url = soup.find("meta", attrs={"property": "og:image"}).attrs["content"] description = soup.find("p", id="description").get_text() _, image_extension = os.path.splitext(os.path.basename(image_url)) image_title = title.strip().replace(" ", "_") image_title = f"{image_title}{image_extension}" save_image(image_url, image_title) return (title, description, image_title) if __name__ == "__main__": title, desc, image_title = random_anime_character() print(f"{title}\n\n{desc}\n\nImage saved : {image_title}")
-1
TheAlgorithms/Python
5,638
Add pyupgrade to pre-commit
### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2021-10-28T14:37:17Z"
"2021-10-28T14:45:59Z"
70368a757e8d37b9f3dd96af4ca535275cb39580
477cc3fe597fd931c742700284016b937c778fe1
Add pyupgrade to pre-commit. ### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
#!/usr/bin/env python3 def climb_stairs(n: int) -> int: """ LeetCdoe No.70: Climbing Stairs Distinct ways to climb a n step staircase where each time you can either climb 1 or 2 steps. Args: n: number of steps of staircase Returns: Distinct ways to climb a n step staircase Raises: AssertionError: n not positive integer >>> climb_stairs(3) 3 >>> climb_stairs(1) 1 >>> climb_stairs(-7) # doctest: +ELLIPSIS Traceback (most recent call last): ... AssertionError: n needs to be positive integer, your input -7 """ assert ( isinstance(n, int) and n > 0 ), f"n needs to be positive integer, your input {n}" if n == 1: return 1 dp = [0] * (n + 1) dp[0], dp[1] = (1, 1) for i in range(2, n + 1): dp[i] = dp[i - 1] + dp[i - 2] return dp[n] if __name__ == "__main__": import doctest doctest.testmod()
#!/usr/bin/env python3 def climb_stairs(n: int) -> int: """ LeetCdoe No.70: Climbing Stairs Distinct ways to climb a n step staircase where each time you can either climb 1 or 2 steps. Args: n: number of steps of staircase Returns: Distinct ways to climb a n step staircase Raises: AssertionError: n not positive integer >>> climb_stairs(3) 3 >>> climb_stairs(1) 1 >>> climb_stairs(-7) # doctest: +ELLIPSIS Traceback (most recent call last): ... AssertionError: n needs to be positive integer, your input -7 """ assert ( isinstance(n, int) and n > 0 ), f"n needs to be positive integer, your input {n}" if n == 1: return 1 dp = [0] * (n + 1) dp[0], dp[1] = (1, 1) for i in range(2, n + 1): dp[i] = dp[i - 1] + dp[i - 2] return dp[n] if __name__ == "__main__": import doctest doctest.testmod()
-1
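The `climb_stairs` record is also carried over unchanged. Its recurrence `dp[i] = dp[i - 1] + dp[i - 2]` is exactly the Fibonacci recurrence, so the table can be replaced by two rolling variables; a constant-space sketch of the same computation (an equivalent reformulation, not something done by the PR):

def climb_stairs(number_of_steps: int) -> int:
    # same recurrence as the dp table in the record, kept in two variables
    previous, current = 1, 1
    for _ in range(number_of_steps - 1):
        previous, current = current, previous + current
    return current


assert climb_stairs(1) == 1
assert climb_stairs(3) == 3  # matches the doctest values above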
TheAlgorithms/Python
5,638
Add pyupgrade to pre-commit
### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2021-10-28T14:37:17Z"
"2021-10-28T14:45:59Z"
70368a757e8d37b9f3dd96af4ca535275cb39580
477cc3fe597fd931c742700284016b937c778fe1
Add pyupgrade to pre-commit. ### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
-1
TheAlgorithms/Python
5,638
Add pyupgrade to pre-commit
### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2021-10-28T14:37:17Z"
"2021-10-28T14:45:59Z"
70368a757e8d37b9f3dd96af4ca535275cb39580
477cc3fe597fd931c742700284016b937c778fe1
Add pyupgrade to pre-commit. ### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Convert between different units of temperature """ def celsius_to_fahrenheit(celsius: float, ndigits: int = 2) -> float: """ Convert a given value from Celsius to Fahrenheit and round it to 2 decimal places. Wikipedia reference: https://en.wikipedia.org/wiki/Celsius Wikipedia reference: https://en.wikipedia.org/wiki/Fahrenheit >>> celsius_to_fahrenheit(273.354, 3) 524.037 >>> celsius_to_fahrenheit(273.354, 0) 524.0 >>> celsius_to_fahrenheit(-40.0) -40.0 >>> celsius_to_fahrenheit(-20.0) -4.0 >>> celsius_to_fahrenheit(0) 32.0 >>> celsius_to_fahrenheit(20) 68.0 >>> celsius_to_fahrenheit("40") 104.0 >>> celsius_to_fahrenheit("celsius") Traceback (most recent call last): ... ValueError: could not convert string to float: 'celsius' """ return round((float(celsius) * 9 / 5) + 32, ndigits) def celsius_to_kelvin(celsius: float, ndigits: int = 2) -> float: """ Convert a given value from Celsius to Kelvin and round it to 2 decimal places. Wikipedia reference: https://en.wikipedia.org/wiki/Celsius Wikipedia reference: https://en.wikipedia.org/wiki/Kelvin >>> celsius_to_kelvin(273.354, 3) 546.504 >>> celsius_to_kelvin(273.354, 0) 547.0 >>> celsius_to_kelvin(0) 273.15 >>> celsius_to_kelvin(20.0) 293.15 >>> celsius_to_kelvin("40") 313.15 >>> celsius_to_kelvin("celsius") Traceback (most recent call last): ... ValueError: could not convert string to float: 'celsius' """ return round(float(celsius) + 273.15, ndigits) def celsius_to_rankine(celsius: float, ndigits: int = 2) -> float: """ Convert a given value from Celsius to Rankine and round it to 2 decimal places. Wikipedia reference: https://en.wikipedia.org/wiki/Celsius Wikipedia reference: https://en.wikipedia.org/wiki/Rankine_scale >>> celsius_to_rankine(273.354, 3) 983.707 >>> celsius_to_rankine(273.354, 0) 984.0 >>> celsius_to_rankine(0) 491.67 >>> celsius_to_rankine(20.0) 527.67 >>> celsius_to_rankine("40") 563.67 >>> celsius_to_rankine("celsius") Traceback (most recent call last): ... ValueError: could not convert string to float: 'celsius' """ return round((float(celsius) * 9 / 5) + 491.67, ndigits) def fahrenheit_to_celsius(fahrenheit: float, ndigits: int = 2) -> float: """ Convert a given value from Fahrenheit to Celsius and round it to 2 decimal places. Wikipedia reference: https://en.wikipedia.org/wiki/Fahrenheit Wikipedia reference: https://en.wikipedia.org/wiki/Celsius >>> fahrenheit_to_celsius(273.354, 3) 134.086 >>> fahrenheit_to_celsius(273.354, 0) 134.0 >>> fahrenheit_to_celsius(0) -17.78 >>> fahrenheit_to_celsius(20.0) -6.67 >>> fahrenheit_to_celsius(40.0) 4.44 >>> fahrenheit_to_celsius(60) 15.56 >>> fahrenheit_to_celsius(80) 26.67 >>> fahrenheit_to_celsius("100") 37.78 >>> fahrenheit_to_celsius("fahrenheit") Traceback (most recent call last): ... ValueError: could not convert string to float: 'fahrenheit' """ return round((float(fahrenheit) - 32) * 5 / 9, ndigits) def fahrenheit_to_kelvin(fahrenheit: float, ndigits: int = 2) -> float: """ Convert a given value from Fahrenheit to Kelvin and round it to 2 decimal places. Wikipedia reference: https://en.wikipedia.org/wiki/Fahrenheit Wikipedia reference: https://en.wikipedia.org/wiki/Kelvin >>> fahrenheit_to_kelvin(273.354, 3) 407.236 >>> fahrenheit_to_kelvin(273.354, 0) 407.0 >>> fahrenheit_to_kelvin(0) 255.37 >>> fahrenheit_to_kelvin(20.0) 266.48 >>> fahrenheit_to_kelvin(40.0) 277.59 >>> fahrenheit_to_kelvin(60) 288.71 >>> fahrenheit_to_kelvin(80) 299.82 >>> fahrenheit_to_kelvin("100") 310.93 >>> fahrenheit_to_kelvin("fahrenheit") Traceback (most recent call last): ... 
ValueError: could not convert string to float: 'fahrenheit' """ return round(((float(fahrenheit) - 32) * 5 / 9) + 273.15, ndigits) def fahrenheit_to_rankine(fahrenheit: float, ndigits: int = 2) -> float: """ Convert a given value from Fahrenheit to Rankine and round it to 2 decimal places. Wikipedia reference: https://en.wikipedia.org/wiki/Fahrenheit Wikipedia reference: https://en.wikipedia.org/wiki/Rankine_scale >>> fahrenheit_to_rankine(273.354, 3) 733.024 >>> fahrenheit_to_rankine(273.354, 0) 733.0 >>> fahrenheit_to_rankine(0) 459.67 >>> fahrenheit_to_rankine(20.0) 479.67 >>> fahrenheit_to_rankine(40.0) 499.67 >>> fahrenheit_to_rankine(60) 519.67 >>> fahrenheit_to_rankine(80) 539.67 >>> fahrenheit_to_rankine("100") 559.67 >>> fahrenheit_to_rankine("fahrenheit") Traceback (most recent call last): ... ValueError: could not convert string to float: 'fahrenheit' """ return round(float(fahrenheit) + 459.67, ndigits) def kelvin_to_celsius(kelvin: float, ndigits: int = 2) -> float: """ Convert a given value from Kelvin to Celsius and round it to 2 decimal places. Wikipedia reference: https://en.wikipedia.org/wiki/Kelvin Wikipedia reference: https://en.wikipedia.org/wiki/Celsius >>> kelvin_to_celsius(273.354, 3) 0.204 >>> kelvin_to_celsius(273.354, 0) 0.0 >>> kelvin_to_celsius(273.15) 0.0 >>> kelvin_to_celsius(300) 26.85 >>> kelvin_to_celsius("315.5") 42.35 >>> kelvin_to_celsius("kelvin") Traceback (most recent call last): ... ValueError: could not convert string to float: 'kelvin' """ return round(float(kelvin) - 273.15, ndigits) def kelvin_to_fahrenheit(kelvin: float, ndigits: int = 2) -> float: """ Convert a given value from Kelvin to Fahrenheit and round it to 2 decimal places. Wikipedia reference: https://en.wikipedia.org/wiki/Kelvin Wikipedia reference: https://en.wikipedia.org/wiki/Fahrenheit >>> kelvin_to_fahrenheit(273.354, 3) 32.367 >>> kelvin_to_fahrenheit(273.354, 0) 32.0 >>> kelvin_to_fahrenheit(273.15) 32.0 >>> kelvin_to_fahrenheit(300) 80.33 >>> kelvin_to_fahrenheit("315.5") 108.23 >>> kelvin_to_fahrenheit("kelvin") Traceback (most recent call last): ... ValueError: could not convert string to float: 'kelvin' """ return round(((float(kelvin) - 273.15) * 9 / 5) + 32, ndigits) def kelvin_to_rankine(kelvin: float, ndigits: int = 2) -> float: """ Convert a given value from Kelvin to Rankine and round it to 2 decimal places. Wikipedia reference: https://en.wikipedia.org/wiki/Kelvin Wikipedia reference: https://en.wikipedia.org/wiki/Rankine_scale >>> kelvin_to_rankine(273.354, 3) 492.037 >>> kelvin_to_rankine(273.354, 0) 492.0 >>> kelvin_to_rankine(0) 0.0 >>> kelvin_to_rankine(20.0) 36.0 >>> kelvin_to_rankine("40") 72.0 >>> kelvin_to_rankine("kelvin") Traceback (most recent call last): ... ValueError: could not convert string to float: 'kelvin' """ return round((float(kelvin) * 9 / 5), ndigits) def rankine_to_celsius(rankine: float, ndigits: int = 2) -> float: """ Convert a given value from Rankine to Celsius and round it to 2 decimal places. Wikipedia reference: https://en.wikipedia.org/wiki/Rankine_scale Wikipedia reference: https://en.wikipedia.org/wiki/Celsius >>> rankine_to_celsius(273.354, 3) -121.287 >>> rankine_to_celsius(273.354, 0) -121.0 >>> rankine_to_celsius(273.15) -121.4 >>> rankine_to_celsius(300) -106.48 >>> rankine_to_celsius("315.5") -97.87 >>> rankine_to_celsius("rankine") Traceback (most recent call last): ... 
ValueError: could not convert string to float: 'rankine' """ return round((float(rankine) - 491.67) * 5 / 9, ndigits) def rankine_to_fahrenheit(rankine: float, ndigits: int = 2) -> float: """ Convert a given value from Rankine to Fahrenheit and round it to 2 decimal places. Wikipedia reference: https://en.wikipedia.org/wiki/Rankine_scale Wikipedia reference: https://en.wikipedia.org/wiki/Fahrenheit >>> rankine_to_fahrenheit(273.15) -186.52 >>> rankine_to_fahrenheit(300) -159.67 >>> rankine_to_fahrenheit("315.5") -144.17 >>> rankine_to_fahrenheit("rankine") Traceback (most recent call last): ... ValueError: could not convert string to float: 'rankine' """ return round(float(rankine) - 459.67, ndigits) def rankine_to_kelvin(rankine: float, ndigits: int = 2) -> float: """ Convert a given value from Rankine to Kelvin and round it to 2 decimal places. Wikipedia reference: https://en.wikipedia.org/wiki/Rankine_scale Wikipedia reference: https://en.wikipedia.org/wiki/Kelvin >>> rankine_to_kelvin(0) 0.0 >>> rankine_to_kelvin(20.0) 11.11 >>> rankine_to_kelvin("40") 22.22 >>> rankine_to_kelvin("rankine") Traceback (most recent call last): ... ValueError: could not convert string to float: 'rankine' """ return round((float(rankine) * 5 / 9), ndigits) def reaumur_to_kelvin(reaumur: float, ndigits: int = 2) -> float: """ Convert a given value from reaumur to Kelvin and round it to 2 decimal places. Reference:- http://www.csgnetwork.com/temp2conv.html >>> reaumur_to_kelvin(0) 273.15 >>> reaumur_to_kelvin(20.0) 298.15 >>> reaumur_to_kelvin(40) 323.15 >>> reaumur_to_kelvin("reaumur") Traceback (most recent call last): ... ValueError: could not convert string to float: 'reaumur' """ return round((float(reaumur) * 1.25 + 273.15), ndigits) def reaumur_to_fahrenheit(reaumur: float, ndigits: int = 2) -> float: """ Convert a given value from reaumur to fahrenheit and round it to 2 decimal places. Reference:- http://www.csgnetwork.com/temp2conv.html >>> reaumur_to_fahrenheit(0) 32.0 >>> reaumur_to_fahrenheit(20.0) 77.0 >>> reaumur_to_fahrenheit(40) 122.0 >>> reaumur_to_fahrenheit("reaumur") Traceback (most recent call last): ... ValueError: could not convert string to float: 'reaumur' """ return round((float(reaumur) * 2.25 + 32), ndigits) def reaumur_to_celsius(reaumur: float, ndigits: int = 2) -> float: """ Convert a given value from reaumur to celsius and round it to 2 decimal places. Reference:- http://www.csgnetwork.com/temp2conv.html >>> reaumur_to_celsius(0) 0.0 >>> reaumur_to_celsius(20.0) 25.0 >>> reaumur_to_celsius(40) 50.0 >>> reaumur_to_celsius("reaumur") Traceback (most recent call last): ... ValueError: could not convert string to float: 'reaumur' """ return round((float(reaumur) * 1.25), ndigits) def reaumur_to_rankine(reaumur: float, ndigits: int = 2) -> float: """ Convert a given value from reaumur to rankine and round it to 2 decimal places. Reference:- http://www.csgnetwork.com/temp2conv.html >>> reaumur_to_rankine(0) 491.67 >>> reaumur_to_rankine(20.0) 536.67 >>> reaumur_to_rankine(40) 581.67 >>> reaumur_to_rankine("reaumur") Traceback (most recent call last): ... ValueError: could not convert string to float: 'reaumur' """ return round((float(reaumur) * 2.25 + 32 + 459.67), ndigits) if __name__ == "__main__": import doctest doctest.testmod()
""" Convert between different units of temperature """ def celsius_to_fahrenheit(celsius: float, ndigits: int = 2) -> float: """ Convert a given value from Celsius to Fahrenheit and round it to 2 decimal places. Wikipedia reference: https://en.wikipedia.org/wiki/Celsius Wikipedia reference: https://en.wikipedia.org/wiki/Fahrenheit >>> celsius_to_fahrenheit(273.354, 3) 524.037 >>> celsius_to_fahrenheit(273.354, 0) 524.0 >>> celsius_to_fahrenheit(-40.0) -40.0 >>> celsius_to_fahrenheit(-20.0) -4.0 >>> celsius_to_fahrenheit(0) 32.0 >>> celsius_to_fahrenheit(20) 68.0 >>> celsius_to_fahrenheit("40") 104.0 >>> celsius_to_fahrenheit("celsius") Traceback (most recent call last): ... ValueError: could not convert string to float: 'celsius' """ return round((float(celsius) * 9 / 5) + 32, ndigits) def celsius_to_kelvin(celsius: float, ndigits: int = 2) -> float: """ Convert a given value from Celsius to Kelvin and round it to 2 decimal places. Wikipedia reference: https://en.wikipedia.org/wiki/Celsius Wikipedia reference: https://en.wikipedia.org/wiki/Kelvin >>> celsius_to_kelvin(273.354, 3) 546.504 >>> celsius_to_kelvin(273.354, 0) 547.0 >>> celsius_to_kelvin(0) 273.15 >>> celsius_to_kelvin(20.0) 293.15 >>> celsius_to_kelvin("40") 313.15 >>> celsius_to_kelvin("celsius") Traceback (most recent call last): ... ValueError: could not convert string to float: 'celsius' """ return round(float(celsius) + 273.15, ndigits) def celsius_to_rankine(celsius: float, ndigits: int = 2) -> float: """ Convert a given value from Celsius to Rankine and round it to 2 decimal places. Wikipedia reference: https://en.wikipedia.org/wiki/Celsius Wikipedia reference: https://en.wikipedia.org/wiki/Rankine_scale >>> celsius_to_rankine(273.354, 3) 983.707 >>> celsius_to_rankine(273.354, 0) 984.0 >>> celsius_to_rankine(0) 491.67 >>> celsius_to_rankine(20.0) 527.67 >>> celsius_to_rankine("40") 563.67 >>> celsius_to_rankine("celsius") Traceback (most recent call last): ... ValueError: could not convert string to float: 'celsius' """ return round((float(celsius) * 9 / 5) + 491.67, ndigits) def fahrenheit_to_celsius(fahrenheit: float, ndigits: int = 2) -> float: """ Convert a given value from Fahrenheit to Celsius and round it to 2 decimal places. Wikipedia reference: https://en.wikipedia.org/wiki/Fahrenheit Wikipedia reference: https://en.wikipedia.org/wiki/Celsius >>> fahrenheit_to_celsius(273.354, 3) 134.086 >>> fahrenheit_to_celsius(273.354, 0) 134.0 >>> fahrenheit_to_celsius(0) -17.78 >>> fahrenheit_to_celsius(20.0) -6.67 >>> fahrenheit_to_celsius(40.0) 4.44 >>> fahrenheit_to_celsius(60) 15.56 >>> fahrenheit_to_celsius(80) 26.67 >>> fahrenheit_to_celsius("100") 37.78 >>> fahrenheit_to_celsius("fahrenheit") Traceback (most recent call last): ... ValueError: could not convert string to float: 'fahrenheit' """ return round((float(fahrenheit) - 32) * 5 / 9, ndigits) def fahrenheit_to_kelvin(fahrenheit: float, ndigits: int = 2) -> float: """ Convert a given value from Fahrenheit to Kelvin and round it to 2 decimal places. Wikipedia reference: https://en.wikipedia.org/wiki/Fahrenheit Wikipedia reference: https://en.wikipedia.org/wiki/Kelvin >>> fahrenheit_to_kelvin(273.354, 3) 407.236 >>> fahrenheit_to_kelvin(273.354, 0) 407.0 >>> fahrenheit_to_kelvin(0) 255.37 >>> fahrenheit_to_kelvin(20.0) 266.48 >>> fahrenheit_to_kelvin(40.0) 277.59 >>> fahrenheit_to_kelvin(60) 288.71 >>> fahrenheit_to_kelvin(80) 299.82 >>> fahrenheit_to_kelvin("100") 310.93 >>> fahrenheit_to_kelvin("fahrenheit") Traceback (most recent call last): ... 
ValueError: could not convert string to float: 'fahrenheit' """ return round(((float(fahrenheit) - 32) * 5 / 9) + 273.15, ndigits) def fahrenheit_to_rankine(fahrenheit: float, ndigits: int = 2) -> float: """ Convert a given value from Fahrenheit to Rankine and round it to 2 decimal places. Wikipedia reference: https://en.wikipedia.org/wiki/Fahrenheit Wikipedia reference: https://en.wikipedia.org/wiki/Rankine_scale >>> fahrenheit_to_rankine(273.354, 3) 733.024 >>> fahrenheit_to_rankine(273.354, 0) 733.0 >>> fahrenheit_to_rankine(0) 459.67 >>> fahrenheit_to_rankine(20.0) 479.67 >>> fahrenheit_to_rankine(40.0) 499.67 >>> fahrenheit_to_rankine(60) 519.67 >>> fahrenheit_to_rankine(80) 539.67 >>> fahrenheit_to_rankine("100") 559.67 >>> fahrenheit_to_rankine("fahrenheit") Traceback (most recent call last): ... ValueError: could not convert string to float: 'fahrenheit' """ return round(float(fahrenheit) + 459.67, ndigits) def kelvin_to_celsius(kelvin: float, ndigits: int = 2) -> float: """ Convert a given value from Kelvin to Celsius and round it to 2 decimal places. Wikipedia reference: https://en.wikipedia.org/wiki/Kelvin Wikipedia reference: https://en.wikipedia.org/wiki/Celsius >>> kelvin_to_celsius(273.354, 3) 0.204 >>> kelvin_to_celsius(273.354, 0) 0.0 >>> kelvin_to_celsius(273.15) 0.0 >>> kelvin_to_celsius(300) 26.85 >>> kelvin_to_celsius("315.5") 42.35 >>> kelvin_to_celsius("kelvin") Traceback (most recent call last): ... ValueError: could not convert string to float: 'kelvin' """ return round(float(kelvin) - 273.15, ndigits) def kelvin_to_fahrenheit(kelvin: float, ndigits: int = 2) -> float: """ Convert a given value from Kelvin to Fahrenheit and round it to 2 decimal places. Wikipedia reference: https://en.wikipedia.org/wiki/Kelvin Wikipedia reference: https://en.wikipedia.org/wiki/Fahrenheit >>> kelvin_to_fahrenheit(273.354, 3) 32.367 >>> kelvin_to_fahrenheit(273.354, 0) 32.0 >>> kelvin_to_fahrenheit(273.15) 32.0 >>> kelvin_to_fahrenheit(300) 80.33 >>> kelvin_to_fahrenheit("315.5") 108.23 >>> kelvin_to_fahrenheit("kelvin") Traceback (most recent call last): ... ValueError: could not convert string to float: 'kelvin' """ return round(((float(kelvin) - 273.15) * 9 / 5) + 32, ndigits) def kelvin_to_rankine(kelvin: float, ndigits: int = 2) -> float: """ Convert a given value from Kelvin to Rankine and round it to 2 decimal places. Wikipedia reference: https://en.wikipedia.org/wiki/Kelvin Wikipedia reference: https://en.wikipedia.org/wiki/Rankine_scale >>> kelvin_to_rankine(273.354, 3) 492.037 >>> kelvin_to_rankine(273.354, 0) 492.0 >>> kelvin_to_rankine(0) 0.0 >>> kelvin_to_rankine(20.0) 36.0 >>> kelvin_to_rankine("40") 72.0 >>> kelvin_to_rankine("kelvin") Traceback (most recent call last): ... ValueError: could not convert string to float: 'kelvin' """ return round((float(kelvin) * 9 / 5), ndigits) def rankine_to_celsius(rankine: float, ndigits: int = 2) -> float: """ Convert a given value from Rankine to Celsius and round it to 2 decimal places. Wikipedia reference: https://en.wikipedia.org/wiki/Rankine_scale Wikipedia reference: https://en.wikipedia.org/wiki/Celsius >>> rankine_to_celsius(273.354, 3) -121.287 >>> rankine_to_celsius(273.354, 0) -121.0 >>> rankine_to_celsius(273.15) -121.4 >>> rankine_to_celsius(300) -106.48 >>> rankine_to_celsius("315.5") -97.87 >>> rankine_to_celsius("rankine") Traceback (most recent call last): ... 
ValueError: could not convert string to float: 'rankine' """ return round((float(rankine) - 491.67) * 5 / 9, ndigits) def rankine_to_fahrenheit(rankine: float, ndigits: int = 2) -> float: """ Convert a given value from Rankine to Fahrenheit and round it to 2 decimal places. Wikipedia reference: https://en.wikipedia.org/wiki/Rankine_scale Wikipedia reference: https://en.wikipedia.org/wiki/Fahrenheit >>> rankine_to_fahrenheit(273.15) -186.52 >>> rankine_to_fahrenheit(300) -159.67 >>> rankine_to_fahrenheit("315.5") -144.17 >>> rankine_to_fahrenheit("rankine") Traceback (most recent call last): ... ValueError: could not convert string to float: 'rankine' """ return round(float(rankine) - 459.67, ndigits) def rankine_to_kelvin(rankine: float, ndigits: int = 2) -> float: """ Convert a given value from Rankine to Kelvin and round it to 2 decimal places. Wikipedia reference: https://en.wikipedia.org/wiki/Rankine_scale Wikipedia reference: https://en.wikipedia.org/wiki/Kelvin >>> rankine_to_kelvin(0) 0.0 >>> rankine_to_kelvin(20.0) 11.11 >>> rankine_to_kelvin("40") 22.22 >>> rankine_to_kelvin("rankine") Traceback (most recent call last): ... ValueError: could not convert string to float: 'rankine' """ return round((float(rankine) * 5 / 9), ndigits) def reaumur_to_kelvin(reaumur: float, ndigits: int = 2) -> float: """ Convert a given value from reaumur to Kelvin and round it to 2 decimal places. Reference:- http://www.csgnetwork.com/temp2conv.html >>> reaumur_to_kelvin(0) 273.15 >>> reaumur_to_kelvin(20.0) 298.15 >>> reaumur_to_kelvin(40) 323.15 >>> reaumur_to_kelvin("reaumur") Traceback (most recent call last): ... ValueError: could not convert string to float: 'reaumur' """ return round((float(reaumur) * 1.25 + 273.15), ndigits) def reaumur_to_fahrenheit(reaumur: float, ndigits: int = 2) -> float: """ Convert a given value from reaumur to fahrenheit and round it to 2 decimal places. Reference:- http://www.csgnetwork.com/temp2conv.html >>> reaumur_to_fahrenheit(0) 32.0 >>> reaumur_to_fahrenheit(20.0) 77.0 >>> reaumur_to_fahrenheit(40) 122.0 >>> reaumur_to_fahrenheit("reaumur") Traceback (most recent call last): ... ValueError: could not convert string to float: 'reaumur' """ return round((float(reaumur) * 2.25 + 32), ndigits) def reaumur_to_celsius(reaumur: float, ndigits: int = 2) -> float: """ Convert a given value from reaumur to celsius and round it to 2 decimal places. Reference:- http://www.csgnetwork.com/temp2conv.html >>> reaumur_to_celsius(0) 0.0 >>> reaumur_to_celsius(20.0) 25.0 >>> reaumur_to_celsius(40) 50.0 >>> reaumur_to_celsius("reaumur") Traceback (most recent call last): ... ValueError: could not convert string to float: 'reaumur' """ return round((float(reaumur) * 1.25), ndigits) def reaumur_to_rankine(reaumur: float, ndigits: int = 2) -> float: """ Convert a given value from reaumur to rankine and round it to 2 decimal places. Reference:- http://www.csgnetwork.com/temp2conv.html >>> reaumur_to_rankine(0) 491.67 >>> reaumur_to_rankine(20.0) 536.67 >>> reaumur_to_rankine(40) 581.67 >>> reaumur_to_rankine("reaumur") Traceback (most recent call last): ... ValueError: could not convert string to float: 'reaumur' """ return round((float(reaumur) * 2.25 + 32 + 459.67), ndigits) if __name__ == "__main__": import doctest doctest.testmod()
-1
TheAlgorithms/Python
5,638
Add pyupgrade to pre-commit
### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2021-10-28T14:37:17Z"
"2021-10-28T14:45:59Z"
70368a757e8d37b9f3dd96af4ca535275cb39580
477cc3fe597fd931c742700284016b937c778fe1
Add pyupgrade to pre-commit. ### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
#!/usr/bin/env python3 """ Build a quantum circuit with pair or group of qubits to perform quantum entanglement. Quantum entanglement is a phenomenon observed at the quantum scale where entangled particles stay connected (in some sense) so that the actions performed on one of the particles affects the other, no matter the distance between two particles. """ import qiskit def quantum_entanglement(qubits: int = 2) -> qiskit.result.counts.Counts: """ # >>> quantum_entanglement(2) # {'00': 500, '11': 500} # ┌───┐ ┌─┐ # q_0: ┤ H ├──■──┤M├─── # └───┘┌─┴─┐└╥┘┌─┐ # q_1: ─────┤ X ├─╫─┤M├ # └───┘ ║ └╥┘ # c: 2/═══════════╩══╩═ # 0 1 Args: qubits (int): number of quibits to use. Defaults to 2 Returns: qiskit.result.counts.Counts: mapping of states to its counts """ classical_bits = qubits # Using Aer's qasm_simulator simulator = qiskit.Aer.get_backend("qasm_simulator") # Creating a Quantum Circuit acting on the q register circuit = qiskit.QuantumCircuit(qubits, classical_bits) # Adding a H gate on qubit 0 (now q0 in superposition) circuit.h(0) for i in range(1, qubits): # Adding CX (CNOT) gate circuit.cx(i - 1, i) # Mapping the quantum measurement to the classical bits circuit.measure(list(range(qubits)), list(range(classical_bits))) # Now measuring any one qubit would affect other qubits to collapse # their super position and have same state as the measured one. # Executing the circuit on the qasm simulator job = qiskit.execute(circuit, simulator, shots=1000) return job.result().get_counts(circuit) if __name__ == "__main__": print(f"Total count for various states are: {quantum_entanglement(3)}")
#!/usr/bin/env python3 """ Build a quantum circuit with pair or group of qubits to perform quantum entanglement. Quantum entanglement is a phenomenon observed at the quantum scale where entangled particles stay connected (in some sense) so that the actions performed on one of the particles affects the other, no matter the distance between two particles. """ import qiskit def quantum_entanglement(qubits: int = 2) -> qiskit.result.counts.Counts: """ # >>> quantum_entanglement(2) # {'00': 500, '11': 500} # ┌───┐ ┌─┐ # q_0: ┤ H ├──■──┤M├─── # └───┘┌─┴─┐└╥┘┌─┐ # q_1: ─────┤ X ├─╫─┤M├ # └───┘ ║ └╥┘ # c: 2/═══════════╩══╩═ # 0 1 Args: qubits (int): number of quibits to use. Defaults to 2 Returns: qiskit.result.counts.Counts: mapping of states to its counts """ classical_bits = qubits # Using Aer's qasm_simulator simulator = qiskit.Aer.get_backend("qasm_simulator") # Creating a Quantum Circuit acting on the q register circuit = qiskit.QuantumCircuit(qubits, classical_bits) # Adding a H gate on qubit 0 (now q0 in superposition) circuit.h(0) for i in range(1, qubits): # Adding CX (CNOT) gate circuit.cx(i - 1, i) # Mapping the quantum measurement to the classical bits circuit.measure(list(range(qubits)), list(range(classical_bits))) # Now measuring any one qubit would affect other qubits to collapse # their super position and have same state as the measured one. # Executing the circuit on the qasm simulator job = qiskit.execute(circuit, simulator, shots=1000) return job.result().get_counts(circuit) if __name__ == "__main__": print(f"Total count for various states are: {quantum_entanglement(3)}")
-1
TheAlgorithms/Python
5,638
Add pyupgrade to pre-commit
### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2021-10-28T14:37:17Z"
"2021-10-28T14:45:59Z"
70368a757e8d37b9f3dd96af4ca535275cb39580
477cc3fe597fd931c742700284016b937c778fe1
Add pyupgrade to pre-commit. ### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
import numpy as np from PIL import Image def rgb2gray(rgb: np.array) -> np.array: """ Return gray image from rgb image >>> rgb2gray(np.array([[[127, 255, 0]]])) array([[187.6453]]) >>> rgb2gray(np.array([[[0, 0, 0]]])) array([[0.]]) >>> rgb2gray(np.array([[[2, 4, 1]]])) array([[3.0598]]) >>> rgb2gray(np.array([[[26, 255, 14], [5, 147, 20], [1, 200, 0]]])) array([[159.0524, 90.0635, 117.6989]]) """ r, g, b = rgb[:, :, 0], rgb[:, :, 1], rgb[:, :, 2] return 0.2989 * r + 0.5870 * g + 0.1140 * b def gray2binary(gray: np.array) -> np.array: """ Return binary image from gray image >>> gray2binary(np.array([[127, 255, 0]])) array([[False, True, False]]) >>> gray2binary(np.array([[0]])) array([[False]]) >>> gray2binary(np.array([[26.2409, 4.9315, 1.4729]])) array([[False, False, False]]) >>> gray2binary(np.array([[26, 255, 14], [5, 147, 20], [1, 200, 0]])) array([[False, True, False], [False, True, False], [False, True, False]]) """ return (127 < gray) & (gray <= 255) def dilation(image: np.array, kernel: np.array) -> np.array: """ Return dilated image >>> dilation(np.array([[True, False, True]]), np.array([[0, 1, 0]])) array([[False, False, False]]) >>> dilation(np.array([[False, False, True]]), np.array([[1, 0, 1]])) array([[False, False, False]]) """ output = np.zeros_like(image) image_padded = np.zeros( (image.shape[0] + kernel.shape[0] - 1, image.shape[1] + kernel.shape[1] - 1) ) # Copy image to padded image image_padded[kernel.shape[0] - 2 : -1 :, kernel.shape[1] - 2 : -1 :] = image # Iterate over image & apply kernel for x in range(image.shape[1]): for y in range(image.shape[0]): summation = ( kernel * image_padded[y : y + kernel.shape[0], x : x + kernel.shape[1]] ).sum() output[y, x] = int(summation > 0) return output # kernel to be applied structuring_element = np.array([[0, 1, 0], [1, 1, 1], [0, 1, 0]]) if __name__ == "__main__": # read original image image = np.array(Image.open(r"..\image_data\lena.jpg")) output = dilation(gray2binary(rgb2gray(image)), structuring_element) # Save the output image pil_img = Image.fromarray(output).convert("RGB") pil_img.save("result_dilation.png")
import numpy as np from PIL import Image def rgb2gray(rgb: np.array) -> np.array: """ Return gray image from rgb image >>> rgb2gray(np.array([[[127, 255, 0]]])) array([[187.6453]]) >>> rgb2gray(np.array([[[0, 0, 0]]])) array([[0.]]) >>> rgb2gray(np.array([[[2, 4, 1]]])) array([[3.0598]]) >>> rgb2gray(np.array([[[26, 255, 14], [5, 147, 20], [1, 200, 0]]])) array([[159.0524, 90.0635, 117.6989]]) """ r, g, b = rgb[:, :, 0], rgb[:, :, 1], rgb[:, :, 2] return 0.2989 * r + 0.5870 * g + 0.1140 * b def gray2binary(gray: np.array) -> np.array: """ Return binary image from gray image >>> gray2binary(np.array([[127, 255, 0]])) array([[False, True, False]]) >>> gray2binary(np.array([[0]])) array([[False]]) >>> gray2binary(np.array([[26.2409, 4.9315, 1.4729]])) array([[False, False, False]]) >>> gray2binary(np.array([[26, 255, 14], [5, 147, 20], [1, 200, 0]])) array([[False, True, False], [False, True, False], [False, True, False]]) """ return (127 < gray) & (gray <= 255) def dilation(image: np.array, kernel: np.array) -> np.array: """ Return dilated image >>> dilation(np.array([[True, False, True]]), np.array([[0, 1, 0]])) array([[False, False, False]]) >>> dilation(np.array([[False, False, True]]), np.array([[1, 0, 1]])) array([[False, False, False]]) """ output = np.zeros_like(image) image_padded = np.zeros( (image.shape[0] + kernel.shape[0] - 1, image.shape[1] + kernel.shape[1] - 1) ) # Copy image to padded image image_padded[kernel.shape[0] - 2 : -1 :, kernel.shape[1] - 2 : -1 :] = image # Iterate over image & apply kernel for x in range(image.shape[1]): for y in range(image.shape[0]): summation = ( kernel * image_padded[y : y + kernel.shape[0], x : x + kernel.shape[1]] ).sum() output[y, x] = int(summation > 0) return output # kernel to be applied structuring_element = np.array([[0, 1, 0], [1, 1, 1], [0, 1, 0]]) if __name__ == "__main__": # read original image image = np.array(Image.open(r"..\image_data\lena.jpg")) output = dilation(gray2binary(rgb2gray(image)), structuring_element) # Save the output image pil_img = Image.fromarray(output).convert("RGB") pil_img.save("result_dilation.png")
-1
TheAlgorithms/Python
5,638
Add pyupgrade to pre-commit
### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2021-10-28T14:37:17Z"
"2021-10-28T14:45:59Z"
70368a757e8d37b9f3dd96af4ca535275cb39580
477cc3fe597fd931c742700284016b937c778fe1
Add pyupgrade to pre-commit. ### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
"""For reference https://en.wikipedia.org/wiki/Odd%E2%80%93even_sort """ def odd_even_sort(input_list: list) -> list: """this algorithm uses the same idea of bubblesort, but by first dividing in two phase (odd and even). Originally developed for use on parallel processors with local interconnections. :param collection: mutable ordered sequence of elements :return: same collection in ascending order Examples: >>> odd_even_sort([5 , 4 ,3 ,2 ,1]) [1, 2, 3, 4, 5] >>> odd_even_sort([]) [] >>> odd_even_sort([-10 ,-1 ,10 ,2]) [-10, -1, 2, 10] >>> odd_even_sort([1 ,2 ,3 ,4]) [1, 2, 3, 4] """ sorted = False while sorted is False: # Until all the indices are traversed keep looping sorted = True for i in range(0, len(input_list) - 1, 2): # iterating over all even indices if input_list[i] > input_list[i + 1]: input_list[i], input_list[i + 1] = input_list[i + 1], input_list[i] # swapping if elements not in order sorted = False for i in range(1, len(input_list) - 1, 2): # iterating over all odd indices if input_list[i] > input_list[i + 1]: input_list[i], input_list[i + 1] = input_list[i + 1], input_list[i] # swapping if elements not in order sorted = False return input_list if __name__ == "__main__": print("Enter list to be sorted") input_list = [int(x) for x in input().split()] # inputing elements of the list in one line sorted_list = odd_even_sort(input_list) print("The sorted list is") print(sorted_list)
"""For reference https://en.wikipedia.org/wiki/Odd%E2%80%93even_sort """ def odd_even_sort(input_list: list) -> list: """this algorithm uses the same idea of bubblesort, but by first dividing in two phase (odd and even). Originally developed for use on parallel processors with local interconnections. :param collection: mutable ordered sequence of elements :return: same collection in ascending order Examples: >>> odd_even_sort([5 , 4 ,3 ,2 ,1]) [1, 2, 3, 4, 5] >>> odd_even_sort([]) [] >>> odd_even_sort([-10 ,-1 ,10 ,2]) [-10, -1, 2, 10] >>> odd_even_sort([1 ,2 ,3 ,4]) [1, 2, 3, 4] """ sorted = False while sorted is False: # Until all the indices are traversed keep looping sorted = True for i in range(0, len(input_list) - 1, 2): # iterating over all even indices if input_list[i] > input_list[i + 1]: input_list[i], input_list[i + 1] = input_list[i + 1], input_list[i] # swapping if elements not in order sorted = False for i in range(1, len(input_list) - 1, 2): # iterating over all odd indices if input_list[i] > input_list[i + 1]: input_list[i], input_list[i + 1] = input_list[i + 1], input_list[i] # swapping if elements not in order sorted = False return input_list if __name__ == "__main__": print("Enter list to be sorted") input_list = [int(x) for x in input().split()] # inputing elements of the list in one line sorted_list = odd_even_sort(input_list) print("The sorted list is") print(sorted_list)
-1
TheAlgorithms/Python
5,638
Add pyupgrade to pre-commit
### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2021-10-28T14:37:17Z"
"2021-10-28T14:45:59Z"
70368a757e8d37b9f3dd96af4ca535275cb39580
477cc3fe597fd931c742700284016b937c778fe1
Add pyupgrade to pre-commit. ### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
def kthPermutation(k, n): """ Finds k'th lexicographic permutation (in increasing order) of 0,1,2,...n-1 in O(n^2) time. Examples: First permutation is always 0,1,2,...n-1 >>> kthPermutation(0,5) [0, 1, 2, 3, 4] The order of permutation of 0,1,2,3 is [0,1,2,3], [0,1,3,2], [0,2,1,3], [0,2,3,1], [0,3,1,2], [0,3,2,1], [1,0,2,3], [1,0,3,2], [1,2,0,3], [1,2,3,0], [1,3,0,2] >>> kthPermutation(10,4) [1, 3, 0, 2] """ # Factorials from 1! to (n-1)! factorials = [1] for i in range(2, n): factorials.append(factorials[-1] * i) assert 0 <= k < factorials[-1] * n, "k out of bounds" permutation = [] elements = list(range(n)) # Find permutation while factorials: factorial = factorials.pop() number, k = divmod(k, factorial) permutation.append(elements[number]) elements.remove(elements[number]) permutation.append(elements[0]) return permutation if __name__ == "__main__": import doctest doctest.testmod()
def kthPermutation(k, n): """ Finds k'th lexicographic permutation (in increasing order) of 0,1,2,...n-1 in O(n^2) time. Examples: First permutation is always 0,1,2,...n-1 >>> kthPermutation(0,5) [0, 1, 2, 3, 4] The order of permutation of 0,1,2,3 is [0,1,2,3], [0,1,3,2], [0,2,1,3], [0,2,3,1], [0,3,1,2], [0,3,2,1], [1,0,2,3], [1,0,3,2], [1,2,0,3], [1,2,3,0], [1,3,0,2] >>> kthPermutation(10,4) [1, 3, 0, 2] """ # Factorials from 1! to (n-1)! factorials = [1] for i in range(2, n): factorials.append(factorials[-1] * i) assert 0 <= k < factorials[-1] * n, "k out of bounds" permutation = [] elements = list(range(n)) # Find permutation while factorials: factorial = factorials.pop() number, k = divmod(k, factorial) permutation.append(elements[number]) elements.remove(elements[number]) permutation.append(elements[0]) return permutation if __name__ == "__main__": import doctest doctest.testmod()
-1
TheAlgorithms/Python
5,638
Add pyupgrade to pre-commit
### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2021-10-28T14:37:17Z"
"2021-10-28T14:45:59Z"
70368a757e8d37b9f3dd96af4ca535275cb39580
477cc3fe597fd931c742700284016b937c778fe1
Add pyupgrade to pre-commit. ### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
# Finding longest distance in Directed Acyclic Graph using KahnsAlgorithm def longestDistance(graph): indegree = [0] * len(graph) queue = [] longDist = [1] * len(graph) for key, values in graph.items(): for i in values: indegree[i] += 1 for i in range(len(indegree)): if indegree[i] == 0: queue.append(i) while queue: vertex = queue.pop(0) for x in graph[vertex]: indegree[x] -= 1 if longDist[vertex] + 1 > longDist[x]: longDist[x] = longDist[vertex] + 1 if indegree[x] == 0: queue.append(x) print(max(longDist)) # Adjacency list of Graph graph = {0: [2, 3, 4], 1: [2, 7], 2: [5], 3: [5, 7], 4: [7], 5: [6], 6: [7], 7: []} longestDistance(graph)
# Finding longest distance in Directed Acyclic Graph using KahnsAlgorithm def longestDistance(graph): indegree = [0] * len(graph) queue = [] longDist = [1] * len(graph) for key, values in graph.items(): for i in values: indegree[i] += 1 for i in range(len(indegree)): if indegree[i] == 0: queue.append(i) while queue: vertex = queue.pop(0) for x in graph[vertex]: indegree[x] -= 1 if longDist[vertex] + 1 > longDist[x]: longDist[x] = longDist[vertex] + 1 if indegree[x] == 0: queue.append(x) print(max(longDist)) # Adjacency list of Graph graph = {0: [2, 3, 4], 1: [2, 7], 2: [5], 3: [5, 7], 4: [7], 5: [6], 6: [7], 7: []} longestDistance(graph)
-1
TheAlgorithms/Python
5,638
Add pyupgrade to pre-commit
### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2021-10-28T14:37:17Z"
"2021-10-28T14:45:59Z"
70368a757e8d37b9f3dd96af4ca535275cb39580
477cc3fe597fd931c742700284016b937c778fe1
Add pyupgrade to pre-commit. ### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
def actual_power(a: int, b: int): """ Function using divide and conquer to calculate a^b. It only works for integer a,b. """ if b == 0: return 1 if (b % 2) == 0: return actual_power(a, int(b / 2)) * actual_power(a, int(b / 2)) else: return a * actual_power(a, int(b / 2)) * actual_power(a, int(b / 2)) def power(a: int, b: int) -> float: """ >>> power(4,6) 4096 >>> power(2,3) 8 >>> power(-2,3) -8 >>> power(2,-3) 0.125 >>> power(-2,-3) -0.125 """ if b < 0: return 1 / actual_power(a, b) return actual_power(a, b) if __name__ == "__main__": print(power(-2, -3))
def actual_power(a: int, b: int): """ Function using divide and conquer to calculate a^b. It only works for integer a,b. """ if b == 0: return 1 if (b % 2) == 0: return actual_power(a, int(b / 2)) * actual_power(a, int(b / 2)) else: return a * actual_power(a, int(b / 2)) * actual_power(a, int(b / 2)) def power(a: int, b: int) -> float: """ >>> power(4,6) 4096 >>> power(2,3) 8 >>> power(-2,3) -8 >>> power(2,-3) 0.125 >>> power(-2,-3) -0.125 """ if b < 0: return 1 / actual_power(a, b) return actual_power(a, b) if __name__ == "__main__": print(power(-2, -3))
-1
TheAlgorithms/Python
5,638
Add pyupgrade to pre-commit
### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2021-10-28T14:37:17Z"
"2021-10-28T14:45:59Z"
70368a757e8d37b9f3dd96af4ca535275cb39580
477cc3fe597fd931c742700284016b937c778fe1
Add pyupgrade to pre-commit. ### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Created on Fri Oct 16 09:31:07 2020 @author: Dr. Tobias Schröder @license: MIT-license This file contains the test-suite for the knapsack problem. """ import unittest from knapsack import knapsack as k class Test(unittest.TestCase): def test_base_case(self): """ test for the base case """ cap = 0 val = [0] w = [0] c = len(val) self.assertEqual(k.knapsack(cap, w, val, c), 0) val = [60] w = [10] c = len(val) self.assertEqual(k.knapsack(cap, w, val, c), 0) def test_easy_case(self): """ test for the base case """ cap = 3 val = [1, 2, 3] w = [3, 2, 1] c = len(val) self.assertEqual(k.knapsack(cap, w, val, c), 5) def test_knapsack(self): """ test for the knapsack """ cap = 50 val = [60, 100, 120] w = [10, 20, 30] c = len(val) self.assertEqual(k.knapsack(cap, w, val, c), 220) if __name__ == "__main__": unittest.main()
""" Created on Fri Oct 16 09:31:07 2020 @author: Dr. Tobias Schröder @license: MIT-license This file contains the test-suite for the knapsack problem. """ import unittest from knapsack import knapsack as k class Test(unittest.TestCase): def test_base_case(self): """ test for the base case """ cap = 0 val = [0] w = [0] c = len(val) self.assertEqual(k.knapsack(cap, w, val, c), 0) val = [60] w = [10] c = len(val) self.assertEqual(k.knapsack(cap, w, val, c), 0) def test_easy_case(self): """ test for the base case """ cap = 3 val = [1, 2, 3] w = [3, 2, 1] c = len(val) self.assertEqual(k.knapsack(cap, w, val, c), 5) def test_knapsack(self): """ test for the knapsack """ cap = 50 val = [60, 100, 120] w = [10, 20, 30] c = len(val) self.assertEqual(k.knapsack(cap, w, val, c), 220) if __name__ == "__main__": unittest.main()
-1
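The test record above imports a `knapsack` module that is not included in the record itself. Below is a minimal recursive 0/1 knapsack sketch written only to illustrate the call signature `knapsack(capacity, weights, values, counter)` that those tests exercise; it is an assumption-based stand-in, not the repository's actual implementation.

from __future__ import annotations


def knapsack(capacity: int, weights: list[int], values: list[int], counter: int) -> int:
    """Best achievable value using the first `counter` items (recursive 0/1 knapsack)."""
    # Base case: no items left to consider or no capacity remaining.
    if counter == 0 or capacity == 0:
        return 0
    # The current item does not fit, so it must be skipped.
    if weights[counter - 1] > capacity:
        return knapsack(capacity, weights, values, counter - 1)
    # Otherwise keep the better of taking or skipping the current item.
    take = values[counter - 1] + knapsack(
        capacity - weights[counter - 1], weights, values, counter - 1
    )
    skip = knapsack(capacity, weights, values, counter - 1)
    return max(take, skip)


if __name__ == "__main__":
    assert knapsack(50, [10, 20, 30], [60, 100, 120], 3) == 220  # mirrors test_knapsack
    assert knapsack(3, [3, 2, 1], [1, 2, 3], 3) == 5  # mirrors test_easy_case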
TheAlgorithms/Python
5,638
Add pyupgrade to pre-commit
### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2021-10-28T14:37:17Z"
"2021-10-28T14:45:59Z"
70368a757e8d37b9f3dd96af4ca535275cb39580
477cc3fe597fd931c742700284016b937c778fe1
Add pyupgrade to pre-commit. ### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
import pandas as pd from matplotlib import pyplot as plt from sklearn.linear_model import LinearRegression # Splitting the dataset into the Training set and Test set from sklearn.model_selection import train_test_split # Fitting Polynomial Regression to the dataset from sklearn.preprocessing import PolynomialFeatures # Importing the dataset dataset = pd.read_csv( "https://s3.us-west-2.amazonaws.com/public.gamelab.fun/dataset/" "position_salaries.csv" ) X = dataset.iloc[:, 1:2].values y = dataset.iloc[:, 2].values X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0) poly_reg = PolynomialFeatures(degree=4) X_poly = poly_reg.fit_transform(X) pol_reg = LinearRegression() pol_reg.fit(X_poly, y) # Visualizing the Polynomial Regression results def viz_polymonial(): plt.scatter(X, y, color="red") plt.plot(X, pol_reg.predict(poly_reg.fit_transform(X)), color="blue") plt.title("Truth or Bluff (Linear Regression)") plt.xlabel("Position level") plt.ylabel("Salary") plt.show() return if __name__ == "__main__": viz_polymonial() # Predicting a new result with Polynomial Regression pol_reg.predict(poly_reg.fit_transform([[5.5]])) # output should be 132148.43750003
import pandas as pd from matplotlib import pyplot as plt from sklearn.linear_model import LinearRegression # Splitting the dataset into the Training set and Test set from sklearn.model_selection import train_test_split # Fitting Polynomial Regression to the dataset from sklearn.preprocessing import PolynomialFeatures # Importing the dataset dataset = pd.read_csv( "https://s3.us-west-2.amazonaws.com/public.gamelab.fun/dataset/" "position_salaries.csv" ) X = dataset.iloc[:, 1:2].values y = dataset.iloc[:, 2].values X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0) poly_reg = PolynomialFeatures(degree=4) X_poly = poly_reg.fit_transform(X) pol_reg = LinearRegression() pol_reg.fit(X_poly, y) # Visualizing the Polynomial Regression results def viz_polymonial(): plt.scatter(X, y, color="red") plt.plot(X, pol_reg.predict(poly_reg.fit_transform(X)), color="blue") plt.title("Truth or Bluff (Linear Regression)") plt.xlabel("Position level") plt.ylabel("Salary") plt.show() return if __name__ == "__main__": viz_polymonial() # Predicting a new result with Polynomial Regression pol_reg.predict(poly_reg.fit_transform([[5.5]])) # output should be 132148.43750003
-1
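The regression record above downloads position_salaries.csv from a remote S3 bucket before fitting. As a self-contained illustration of the same PolynomialFeatures + LinearRegression flow, the sketch below substitutes a small synthetic dataset (the cubic salary curve is an assumption, so the predicted number differs from the 132148.4375 noted in the record).

import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

# Synthetic stand-in for the position/salary data: levels 1..10, cubic-ish salaries.
X = np.arange(1, 11, dtype=float).reshape(-1, 1)
y = 1000.0 * X.ravel() ** 3 + 50_000.0

# Expand the single feature into polynomial terms, then fit an ordinary linear model.
poly = PolynomialFeatures(degree=4)
model = LinearRegression()
model.fit(poly.fit_transform(X), y)

# Predict at an intermediate level, mirroring the original script's query at 5.5.
print(model.predict(poly.transform([[5.5]])))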
TheAlgorithms/Python
5,638
Add pyupgrade to pre-commit
### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2021-10-28T14:37:17Z"
"2021-10-28T14:45:59Z"
70368a757e8d37b9f3dd96af4ca535275cb39580
477cc3fe597fd931c742700284016b937c778fe1
Add pyupgrade to pre-commit. ### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Round Robin is a scheduling algorithm. In Round Robin each process is assigned a fixed time slot in a cyclic way. https://en.wikipedia.org/wiki/Round-robin_scheduling """ from __future__ import annotations from statistics import mean def calculate_waiting_times(burst_times: list[int]) -> list[int]: """ Calculate the waiting times of a list of processes that have a specified duration. Return: The waiting time for each process. >>> calculate_waiting_times([10, 5, 8]) [13, 10, 13] >>> calculate_waiting_times([4, 6, 3, 1]) [5, 8, 9, 6] >>> calculate_waiting_times([12, 2, 10]) [12, 2, 12] """ quantum = 2 rem_burst_times = list(burst_times) waiting_times = [0] * len(burst_times) t = 0 while True: done = True for i, burst_time in enumerate(burst_times): if rem_burst_times[i] > 0: done = False if rem_burst_times[i] > quantum: t += quantum rem_burst_times[i] -= quantum else: t += rem_burst_times[i] waiting_times[i] = t - burst_time rem_burst_times[i] = 0 if done is True: return waiting_times def calculate_turn_around_times( burst_times: list[int], waiting_times: list[int] ) -> list[int]: """ >>> calculate_turn_around_times([1, 2, 3, 4], [0, 1, 3]) [1, 3, 6] >>> calculate_turn_around_times([10, 3, 7], [10, 6, 11]) [20, 9, 18] """ return [burst + waiting for burst, waiting in zip(burst_times, waiting_times)] if __name__ == "__main__": burst_times = [3, 5, 7] waiting_times = calculate_waiting_times(burst_times) turn_around_times = calculate_turn_around_times(burst_times, waiting_times) print("Process ID \tBurst Time \tWaiting Time \tTurnaround Time") for i, burst_time in enumerate(burst_times): print( f" {i + 1}\t\t {burst_time}\t\t {waiting_times[i]}\t\t " f"{turn_around_times[i]}" ) print(f"\nAverage waiting time = {mean(waiting_times):.5f}") print(f"Average turn around time = {mean(turn_around_times):.5f}")
""" Round Robin is a scheduling algorithm. In Round Robin each process is assigned a fixed time slot in a cyclic way. https://en.wikipedia.org/wiki/Round-robin_scheduling """ from __future__ import annotations from statistics import mean def calculate_waiting_times(burst_times: list[int]) -> list[int]: """ Calculate the waiting times of a list of processes that have a specified duration. Return: The waiting time for each process. >>> calculate_waiting_times([10, 5, 8]) [13, 10, 13] >>> calculate_waiting_times([4, 6, 3, 1]) [5, 8, 9, 6] >>> calculate_waiting_times([12, 2, 10]) [12, 2, 12] """ quantum = 2 rem_burst_times = list(burst_times) waiting_times = [0] * len(burst_times) t = 0 while True: done = True for i, burst_time in enumerate(burst_times): if rem_burst_times[i] > 0: done = False if rem_burst_times[i] > quantum: t += quantum rem_burst_times[i] -= quantum else: t += rem_burst_times[i] waiting_times[i] = t - burst_time rem_burst_times[i] = 0 if done is True: return waiting_times def calculate_turn_around_times( burst_times: list[int], waiting_times: list[int] ) -> list[int]: """ >>> calculate_turn_around_times([1, 2, 3, 4], [0, 1, 3]) [1, 3, 6] >>> calculate_turn_around_times([10, 3, 7], [10, 6, 11]) [20, 9, 18] """ return [burst + waiting for burst, waiting in zip(burst_times, waiting_times)] if __name__ == "__main__": burst_times = [3, 5, 7] waiting_times = calculate_waiting_times(burst_times) turn_around_times = calculate_turn_around_times(burst_times, waiting_times) print("Process ID \tBurst Time \tWaiting Time \tTurnaround Time") for i, burst_time in enumerate(burst_times): print( f" {i + 1}\t\t {burst_time}\t\t {waiting_times[i]}\t\t " f"{turn_around_times[i]}" ) print(f"\nAverage waiting time = {mean(waiting_times):.5f}") print(f"Average turn around time = {mean(turn_around_times):.5f}")
-1
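As a quick cross-check of the first doctest in the round-robin record above (burst times [10, 5, 8] giving waiting times [13, 10, 13]), this independent sketch runs the same quantum-2 schedule with an explicit ready queue and derives each waiting time as completion time minus burst time. It is a verification aid, not the module's API.

from __future__ import annotations

from collections import deque


def rr_waiting_times(bursts: list[int], quantum: int = 2) -> list[int]:
    remaining = list(bursts)
    finish = [0] * len(bursts)
    ready = deque(range(len(bursts)))
    clock = 0
    while ready:
        i = ready.popleft()
        run = min(quantum, remaining[i])  # run the process for at most one quantum
        clock += run
        remaining[i] -= run
        if remaining[i]:
            ready.append(i)  # unfinished processes rejoin the back of the queue
        else:
            finish[i] = clock
    return [finish[i] - bursts[i] for i in range(len(bursts))]


print(rr_waiting_times([10, 5, 8]))  # -> [13, 10, 13]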
TheAlgorithms/Python
5,638
Add pyupgrade to pre-commit
### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2021-10-28T14:37:17Z"
"2021-10-28T14:45:59Z"
70368a757e8d37b9f3dd96af4ca535275cb39580
477cc3fe597fd931c742700284016b937c778fe1
Add pyupgrade to pre-commit. ### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
from __future__ import annotations from typing import Generic, TypeVar T = TypeVar("T") class StackOverflowError(BaseException): pass class StackUnderflowError(BaseException): pass class Stack(Generic[T]): """A stack is an abstract data type that serves as a collection of elements with two principal operations: push() and pop(). push() adds an element to the top of the stack, and pop() removes an element from the top of a stack. The order in which elements come off of a stack are Last In, First Out (LIFO). https://en.wikipedia.org/wiki/Stack_(abstract_data_type) """ def __init__(self, limit: int = 10): self.stack: list[T] = [] self.limit = limit def __bool__(self) -> bool: return bool(self.stack) def __str__(self) -> str: return str(self.stack) def push(self, data: T) -> None: """Push an element to the top of the stack.""" if len(self.stack) >= self.limit: raise StackOverflowError self.stack.append(data) def pop(self) -> T: """ Pop an element off of the top of the stack. >>> Stack().pop() Traceback (most recent call last): ... data_structures.stacks.stack.StackUnderflowError """ if not self.stack: raise StackUnderflowError return self.stack.pop() def peek(self) -> T: """ Peek at the top-most element of the stack. >>> Stack().pop() Traceback (most recent call last): ... data_structures.stacks.stack.StackUnderflowError """ if not self.stack: raise StackUnderflowError return self.stack[-1] def is_empty(self) -> bool: """Check if a stack is empty.""" return not bool(self.stack) def is_full(self) -> bool: return self.size() == self.limit def size(self) -> int: """Return the size of the stack.""" return len(self.stack) def __contains__(self, item: T) -> bool: """Check if item is in stack""" return item in self.stack def test_stack() -> None: """ >>> test_stack() """ stack: Stack[int] = Stack(10) assert bool(stack) is False assert stack.is_empty() is True assert stack.is_full() is False assert str(stack) == "[]" try: _ = stack.pop() assert False # This should not happen except StackUnderflowError: assert True # This should happen try: _ = stack.peek() assert False # This should not happen except StackUnderflowError: assert True # This should happen for i in range(10): assert stack.size() == i stack.push(i) assert bool(stack) assert not stack.is_empty() assert stack.is_full() assert str(stack) == str(list(range(10))) assert stack.pop() == 9 assert stack.peek() == 8 stack.push(100) assert str(stack) == str([0, 1, 2, 3, 4, 5, 6, 7, 8, 100]) try: stack.push(200) assert False # This should not happen except StackOverflowError: assert True # This should happen assert not stack.is_empty() assert stack.size() == 10 assert 5 in stack assert 55 not in stack if __name__ == "__main__": test_stack()
from __future__ import annotations from typing import Generic, TypeVar T = TypeVar("T") class StackOverflowError(BaseException): pass class StackUnderflowError(BaseException): pass class Stack(Generic[T]): """A stack is an abstract data type that serves as a collection of elements with two principal operations: push() and pop(). push() adds an element to the top of the stack, and pop() removes an element from the top of a stack. The order in which elements come off of a stack are Last In, First Out (LIFO). https://en.wikipedia.org/wiki/Stack_(abstract_data_type) """ def __init__(self, limit: int = 10): self.stack: list[T] = [] self.limit = limit def __bool__(self) -> bool: return bool(self.stack) def __str__(self) -> str: return str(self.stack) def push(self, data: T) -> None: """Push an element to the top of the stack.""" if len(self.stack) >= self.limit: raise StackOverflowError self.stack.append(data) def pop(self) -> T: """ Pop an element off of the top of the stack. >>> Stack().pop() Traceback (most recent call last): ... data_structures.stacks.stack.StackUnderflowError """ if not self.stack: raise StackUnderflowError return self.stack.pop() def peek(self) -> T: """ Peek at the top-most element of the stack. >>> Stack().pop() Traceback (most recent call last): ... data_structures.stacks.stack.StackUnderflowError """ if not self.stack: raise StackUnderflowError return self.stack[-1] def is_empty(self) -> bool: """Check if a stack is empty.""" return not bool(self.stack) def is_full(self) -> bool: return self.size() == self.limit def size(self) -> int: """Return the size of the stack.""" return len(self.stack) def __contains__(self, item: T) -> bool: """Check if item is in stack""" return item in self.stack def test_stack() -> None: """ >>> test_stack() """ stack: Stack[int] = Stack(10) assert bool(stack) is False assert stack.is_empty() is True assert stack.is_full() is False assert str(stack) == "[]" try: _ = stack.pop() assert False # This should not happen except StackUnderflowError: assert True # This should happen try: _ = stack.peek() assert False # This should not happen except StackUnderflowError: assert True # This should happen for i in range(10): assert stack.size() == i stack.push(i) assert bool(stack) assert not stack.is_empty() assert stack.is_full() assert str(stack) == str(list(range(10))) assert stack.pop() == 9 assert stack.peek() == 8 stack.push(100) assert str(stack) == str([0, 1, 2, 3, 4, 5, 6, 7, 8, 100]) try: stack.push(200) assert False # This should not happen except StackOverflowError: assert True # This should happen assert not stack.is_empty() assert stack.size() == 10 assert 5 in stack assert 55 not in stack if __name__ == "__main__": test_stack()
-1
TheAlgorithms/Python
5,638
Add pyupgrade to pre-commit
### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2021-10-28T14:37:17Z"
"2021-10-28T14:45:59Z"
70368a757e8d37b9f3dd96af4ca535275cb39580
477cc3fe597fd931c742700284016b937c778fe1
Add pyupgrade to pre-commit. ### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Convert Base 10 (Decimal) Values to Hexadecimal Representations """ # set decimal value for each hexadecimal digit values = { 0: "0", 1: "1", 2: "2", 3: "3", 4: "4", 5: "5", 6: "6", 7: "7", 8: "8", 9: "9", 10: "a", 11: "b", 12: "c", 13: "d", 14: "e", 15: "f", } def decimal_to_hexadecimal(decimal: float) -> str: """ take integer decimal value, return hexadecimal representation as str beginning with 0x >>> decimal_to_hexadecimal(5) '0x5' >>> decimal_to_hexadecimal(15) '0xf' >>> decimal_to_hexadecimal(37) '0x25' >>> decimal_to_hexadecimal(255) '0xff' >>> decimal_to_hexadecimal(4096) '0x1000' >>> decimal_to_hexadecimal(999098) '0xf3eba' >>> # negatives work too >>> decimal_to_hexadecimal(-256) '-0x100' >>> # floats are acceptable if equivalent to an int >>> decimal_to_hexadecimal(17.0) '0x11' >>> # other floats will error >>> decimal_to_hexadecimal(16.16) # doctest: +ELLIPSIS Traceback (most recent call last): ... AssertionError >>> # strings will error as well >>> decimal_to_hexadecimal('0xfffff') # doctest: +ELLIPSIS Traceback (most recent call last): ... AssertionError >>> # results are the same when compared to Python's default hex function >>> decimal_to_hexadecimal(-256) == hex(-256) True """ assert type(decimal) in (int, float) and decimal == int(decimal) decimal = int(decimal) hexadecimal = "" negative = False if decimal < 0: negative = True decimal *= -1 while decimal > 0: decimal, remainder = divmod(decimal, 16) hexadecimal = values[remainder] + hexadecimal hexadecimal = "0x" + hexadecimal if negative: hexadecimal = "-" + hexadecimal return hexadecimal if __name__ == "__main__": import doctest doctest.testmod()
""" Convert Base 10 (Decimal) Values to Hexadecimal Representations """ # set decimal value for each hexadecimal digit values = { 0: "0", 1: "1", 2: "2", 3: "3", 4: "4", 5: "5", 6: "6", 7: "7", 8: "8", 9: "9", 10: "a", 11: "b", 12: "c", 13: "d", 14: "e", 15: "f", } def decimal_to_hexadecimal(decimal: float) -> str: """ take integer decimal value, return hexadecimal representation as str beginning with 0x >>> decimal_to_hexadecimal(5) '0x5' >>> decimal_to_hexadecimal(15) '0xf' >>> decimal_to_hexadecimal(37) '0x25' >>> decimal_to_hexadecimal(255) '0xff' >>> decimal_to_hexadecimal(4096) '0x1000' >>> decimal_to_hexadecimal(999098) '0xf3eba' >>> # negatives work too >>> decimal_to_hexadecimal(-256) '-0x100' >>> # floats are acceptable if equivalent to an int >>> decimal_to_hexadecimal(17.0) '0x11' >>> # other floats will error >>> decimal_to_hexadecimal(16.16) # doctest: +ELLIPSIS Traceback (most recent call last): ... AssertionError >>> # strings will error as well >>> decimal_to_hexadecimal('0xfffff') # doctest: +ELLIPSIS Traceback (most recent call last): ... AssertionError >>> # results are the same when compared to Python's default hex function >>> decimal_to_hexadecimal(-256) == hex(-256) True """ assert type(decimal) in (int, float) and decimal == int(decimal) decimal = int(decimal) hexadecimal = "" negative = False if decimal < 0: negative = True decimal *= -1 while decimal > 0: decimal, remainder = divmod(decimal, 16) hexadecimal = values[remainder] + hexadecimal hexadecimal = "0x" + hexadecimal if negative: hexadecimal = "-" + hexadecimal return hexadecimal if __name__ == "__main__": import doctest doctest.testmod()
-1
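A worked pass through the divmod loop in the record above, for the doctest value 255: each step produces one hexadecimal digit from the remainder, and the digits are prepended as they appear.

# 255 = 16 * 15 + 15 -> digit "f", 15 left to convert
# 15  = 16 * 0  + 15 -> digit "f", 0 left, loop ends; digits assemble to "0xff"
assert divmod(255, 16) == (15, 15)
assert divmod(15, 16) == (0, 15)
assert hex(255) == "0xff"  # matches the function's doctest for 255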
TheAlgorithms/Python
5,638
Add pyupgrade to pre-commit
### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2021-10-28T14:37:17Z"
"2021-10-28T14:45:59Z"
70368a757e8d37b9f3dd96af4ca535275cb39580
477cc3fe597fd931c742700284016b937c778fe1
Add pyupgrade to pre-commit. ### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
-1
TheAlgorithms/Python
5,638
Add pyupgrade to pre-commit
### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2021-10-28T14:37:17Z"
"2021-10-28T14:45:59Z"
70368a757e8d37b9f3dd96af4ca535275cb39580
477cc3fe597fd931c742700284016b937c778fe1
Add pyupgrade to pre-commit. ### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
def quick_sort(data: list) -> list: """ >>> for data in ([2, 1, 0], [2.2, 1.1, 0], "quick_sort"): ... quick_sort(data) == sorted(data) True True True """ if len(data) <= 1: return data else: return ( quick_sort([e for e in data[1:] if e <= data[0]]) + [data[0]] + quick_sort([e for e in data[1:] if e > data[0]]) ) if __name__ == "__main__": import doctest doctest.testmod()
def quick_sort(data: list) -> list: """ >>> for data in ([2, 1, 0], [2.2, 1.1, 0], "quick_sort"): ... quick_sort(data) == sorted(data) True True True """ if len(data) <= 1: return data else: return ( quick_sort([e for e in data[1:] if e <= data[0]]) + [data[0]] + quick_sort([e for e in data[1:] if e > data[0]]) ) if __name__ == "__main__": import doctest doctest.testmod()
-1
TheAlgorithms/Python
5,638
Add pyupgrade to pre-commit
### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2021-10-28T14:37:17Z"
"2021-10-28T14:45:59Z"
70368a757e8d37b9f3dd96af4ca535275cb39580
477cc3fe597fd931c742700284016b937c778fe1
Add pyupgrade to pre-commit. ### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" By starting at the top of the triangle below and moving to adjacent numbers on the row below, the maximum total from top to bottom is 23. 3 7 4 2 4 6 8 5 9 3 That is, 3 + 7 + 4 + 9 = 23. Find the maximum total from top to bottom of the triangle below: 75 95 64 17 47 82 18 35 87 10 20 04 82 47 65 19 01 23 75 03 34 88 02 77 73 07 63 67 99 65 04 28 06 16 70 92 41 41 26 56 83 40 80 70 33 41 48 72 33 47 32 37 16 94 29 53 71 44 65 25 43 91 52 97 51 14 70 11 33 28 77 73 17 78 39 68 17 57 91 71 52 38 17 14 91 43 58 50 27 29 48 63 66 04 68 89 53 67 30 73 16 69 87 40 31 04 62 98 27 23 09 70 98 73 93 38 53 60 04 23 """ import os def solution(): """ Finds the maximum total in a triangle as described by the problem statement above. >>> solution() 1074 """ script_dir = os.path.dirname(os.path.realpath(__file__)) triangle = os.path.join(script_dir, "triangle.txt") with open(triangle) as f: triangle = f.readlines() a = [[int(y) for y in x.rstrip("\r\n").split(" ")] for x in triangle] for i in range(1, len(a)): for j in range(len(a[i])): if j != len(a[i - 1]): number1 = a[i - 1][j] else: number1 = 0 if j > 0: number2 = a[i - 1][j - 1] else: number2 = 0 a[i][j] += max(number1, number2) return max(a[-1]) if __name__ == "__main__": print(solution())
""" By starting at the top of the triangle below and moving to adjacent numbers on the row below, the maximum total from top to bottom is 23. 3 7 4 2 4 6 8 5 9 3 That is, 3 + 7 + 4 + 9 = 23. Find the maximum total from top to bottom of the triangle below: 75 95 64 17 47 82 18 35 87 10 20 04 82 47 65 19 01 23 75 03 34 88 02 77 73 07 63 67 99 65 04 28 06 16 70 92 41 41 26 56 83 40 80 70 33 41 48 72 33 47 32 37 16 94 29 53 71 44 65 25 43 91 52 97 51 14 70 11 33 28 77 73 17 78 39 68 17 57 91 71 52 38 17 14 91 43 58 50 27 29 48 63 66 04 68 89 53 67 30 73 16 69 87 40 31 04 62 98 27 23 09 70 98 73 93 38 53 60 04 23 """ import os def solution(): """ Finds the maximum total in a triangle as described by the problem statement above. >>> solution() 1074 """ script_dir = os.path.dirname(os.path.realpath(__file__)) triangle = os.path.join(script_dir, "triangle.txt") with open(triangle) as f: triangle = f.readlines() a = [[int(y) for y in x.rstrip("\r\n").split(" ")] for x in triangle] for i in range(1, len(a)): for j in range(len(a[i])): if j != len(a[i - 1]): number1 = a[i - 1][j] else: number1 = 0 if j > 0: number2 = a[i - 1][j - 1] else: number2 = 0 a[i][j] += max(number1, number2) return max(a[-1]) if __name__ == "__main__": print(solution())
-1
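To make the maximum-path logic in solution() above concrete without reading triangle.txt, the same row-by-row accumulation can be applied directly to the small triangle from the problem statement, reproducing the stated total of 23 (path 3 + 7 + 4 + 9).

triangle = [
    [3],
    [7, 4],
    [2, 4, 6],
    [8, 5, 9, 3],
]
# Accumulate each entry with the larger of its two parents, exactly as in solution().
for i in range(1, len(triangle)):
    for j in range(len(triangle[i])):
        above = triangle[i - 1][j] if j != len(triangle[i - 1]) else 0
        above_left = triangle[i - 1][j - 1] if j > 0 else 0
        triangle[i][j] += max(above, above_left)
print(max(triangle[-1]))  # -> 23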
TheAlgorithms/Python
5,638
Add pyupgrade to pre-commit
### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2021-10-28T14:37:17Z"
"2021-10-28T14:45:59Z"
70368a757e8d37b9f3dd96af4ca535275cb39580
477cc3fe597fd931c742700284016b937c778fe1
Add pyupgrade to pre-commit. ### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" This is a pure Python implementation of the Harmonic Series algorithm https://en.wikipedia.org/wiki/Harmonic_series_(mathematics) For doctests run following command: python -m doctest -v harmonic_series.py or python3 -m doctest -v harmonic_series.py For manual testing run: python3 harmonic_series.py """ def harmonic_series(n_term: str) -> list: """Pure Python implementation of Harmonic Series algorithm :param n_term: The last (nth) term of Harmonic Series :return: The Harmonic Series starting from 1 to last (nth) term Examples: >>> harmonic_series(5) ['1', '1/2', '1/3', '1/4', '1/5'] >>> harmonic_series(5.0) ['1', '1/2', '1/3', '1/4', '1/5'] >>> harmonic_series(5.1) ['1', '1/2', '1/3', '1/4', '1/5'] >>> harmonic_series(-5) [] >>> harmonic_series(0) [] >>> harmonic_series(1) ['1'] """ if n_term == "": return [] series: list = [] for temp in range(int(n_term)): series.append(f"1/{temp + 1}" if series else "1") return series if __name__ == "__main__": nth_term = input("Enter the last number (nth term) of the Harmonic Series") print("Formula of Harmonic Series => 1+1/2+1/3 ..... 1/n") print(harmonic_series(nth_term))
""" This is a pure Python implementation of the Harmonic Series algorithm https://en.wikipedia.org/wiki/Harmonic_series_(mathematics) For doctests run following command: python -m doctest -v harmonic_series.py or python3 -m doctest -v harmonic_series.py For manual testing run: python3 harmonic_series.py """ def harmonic_series(n_term: str) -> list: """Pure Python implementation of Harmonic Series algorithm :param n_term: The last (nth) term of Harmonic Series :return: The Harmonic Series starting from 1 to last (nth) term Examples: >>> harmonic_series(5) ['1', '1/2', '1/3', '1/4', '1/5'] >>> harmonic_series(5.0) ['1', '1/2', '1/3', '1/4', '1/5'] >>> harmonic_series(5.1) ['1', '1/2', '1/3', '1/4', '1/5'] >>> harmonic_series(-5) [] >>> harmonic_series(0) [] >>> harmonic_series(1) ['1'] """ if n_term == "": return [] series: list = [] for temp in range(int(n_term)): series.append(f"1/{temp + 1}" if series else "1") return series if __name__ == "__main__": nth_term = input("Enter the last number (nth term) of the Harmonic Series") print("Formula of Harmonic Series => 1+1/2+1/3 ..... 1/n") print(harmonic_series(nth_term))
-1
TheAlgorithms/Python
5,638
Add pyupgrade to pre-commit
### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2021-10-28T14:37:17Z"
"2021-10-28T14:45:59Z"
70368a757e8d37b9f3dd96af4ca535275cb39580
477cc3fe597fd931c742700284016b937c778fe1
Add pyupgrade to pre-commit. ### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Conversion of pressure units. Available Units:- Pascal,Bar,Kilopascal,Megapascal,psi(pound per square inch), inHg(in mercury column),torr,atm USAGE : -> Import this file into their respective project. -> Use the function pressure_conversion() for conversion of pressure units. -> Parameters : -> value : The number of from units you want to convert -> from_type : From which type you want to convert -> to_type : To which type you want to convert REFERENCES : -> Wikipedia reference: https://en.wikipedia.org/wiki/Pascal_(unit) -> Wikipedia reference: https://en.wikipedia.org/wiki/Pound_per_square_inch -> Wikipedia reference: https://en.wikipedia.org/wiki/Inch_of_mercury -> Wikipedia reference: https://en.wikipedia.org/wiki/Torr -> https://en.wikipedia.org/wiki/Standard_atmosphere_(unit) -> https://msestudent.com/what-are-the-units-of-pressure/ -> https://www.unitconverters.net/pressure-converter.html """ from collections import namedtuple from_to = namedtuple("from_to", "from_ to") PRESSURE_CONVERSION = { "atm": from_to(1, 1), "pascal": from_to(0.0000098, 101325), "bar": from_to(0.986923, 1.01325), "kilopascal": from_to(0.00986923, 101.325), "megapascal": from_to(9.86923, 0.101325), "psi": from_to(0.068046, 14.6959), "inHg": from_to(0.0334211, 29.9213), "torr": from_to(0.00131579, 760), } def pressure_conversion(value: float, from_type: str, to_type: str) -> float: """ Conversion between pressure units. >>> pressure_conversion(4, "atm", "pascal") 405300 >>> pressure_conversion(1, "pascal", "psi") 0.00014401981999999998 >>> pressure_conversion(1, "bar", "atm") 0.986923 >>> pressure_conversion(3, "kilopascal", "bar") 0.029999991892499998 >>> pressure_conversion(2, "megapascal", "psi") 290.074434314 >>> pressure_conversion(4, "psi", "torr") 206.85984 >>> pressure_conversion(1, "inHg", "atm") 0.0334211 >>> pressure_conversion(1, "torr", "psi") 0.019336718261000002 >>> pressure_conversion(4, "wrongUnit", "atm") Traceback (most recent call last): File "/usr/lib/python3.8/doctest.py", line 1336, in __run exec(compile(example.source, filename, "single", File "<doctest __main__.pressure_conversion[8]>", line 1, in <module> pressure_conversion(4, "wrongUnit", "atm") File "<string>", line 67, in pressure_conversion ValueError: Invalid 'from_type' value: 'wrongUnit' Supported values are: atm, pascal, bar, kilopascal, megapascal, psi, inHg, torr """ if from_type not in PRESSURE_CONVERSION: raise ValueError( f"Invalid 'from_type' value: {from_type!r} Supported values are:\n" + ", ".join(PRESSURE_CONVERSION) ) if to_type not in PRESSURE_CONVERSION: raise ValueError( f"Invalid 'to_type' value: {to_type!r}. Supported values are:\n" + ", ".join(PRESSURE_CONVERSION) ) return ( value * PRESSURE_CONVERSION[from_type].from_ * PRESSURE_CONVERSION[to_type].to ) if __name__ == "__main__": import doctest doctest.testmod()
""" Conversion of pressure units. Available Units:- Pascal,Bar,Kilopascal,Megapascal,psi(pound per square inch), inHg(in mercury column),torr,atm USAGE : -> Import this file into their respective project. -> Use the function pressure_conversion() for conversion of pressure units. -> Parameters : -> value : The number of from units you want to convert -> from_type : From which type you want to convert -> to_type : To which type you want to convert REFERENCES : -> Wikipedia reference: https://en.wikipedia.org/wiki/Pascal_(unit) -> Wikipedia reference: https://en.wikipedia.org/wiki/Pound_per_square_inch -> Wikipedia reference: https://en.wikipedia.org/wiki/Inch_of_mercury -> Wikipedia reference: https://en.wikipedia.org/wiki/Torr -> https://en.wikipedia.org/wiki/Standard_atmosphere_(unit) -> https://msestudent.com/what-are-the-units-of-pressure/ -> https://www.unitconverters.net/pressure-converter.html """ from collections import namedtuple from_to = namedtuple("from_to", "from_ to") PRESSURE_CONVERSION = { "atm": from_to(1, 1), "pascal": from_to(0.0000098, 101325), "bar": from_to(0.986923, 1.01325), "kilopascal": from_to(0.00986923, 101.325), "megapascal": from_to(9.86923, 0.101325), "psi": from_to(0.068046, 14.6959), "inHg": from_to(0.0334211, 29.9213), "torr": from_to(0.00131579, 760), } def pressure_conversion(value: float, from_type: str, to_type: str) -> float: """ Conversion between pressure units. >>> pressure_conversion(4, "atm", "pascal") 405300 >>> pressure_conversion(1, "pascal", "psi") 0.00014401981999999998 >>> pressure_conversion(1, "bar", "atm") 0.986923 >>> pressure_conversion(3, "kilopascal", "bar") 0.029999991892499998 >>> pressure_conversion(2, "megapascal", "psi") 290.074434314 >>> pressure_conversion(4, "psi", "torr") 206.85984 >>> pressure_conversion(1, "inHg", "atm") 0.0334211 >>> pressure_conversion(1, "torr", "psi") 0.019336718261000002 >>> pressure_conversion(4, "wrongUnit", "atm") Traceback (most recent call last): File "/usr/lib/python3.8/doctest.py", line 1336, in __run exec(compile(example.source, filename, "single", File "<doctest __main__.pressure_conversion[8]>", line 1, in <module> pressure_conversion(4, "wrongUnit", "atm") File "<string>", line 67, in pressure_conversion ValueError: Invalid 'from_type' value: 'wrongUnit' Supported values are: atm, pascal, bar, kilopascal, megapascal, psi, inHg, torr """ if from_type not in PRESSURE_CONVERSION: raise ValueError( f"Invalid 'from_type' value: {from_type!r} Supported values are:\n" + ", ".join(PRESSURE_CONVERSION) ) if to_type not in PRESSURE_CONVERSION: raise ValueError( f"Invalid 'to_type' value: {to_type!r}. Supported values are:\n" + ", ".join(PRESSURE_CONVERSION) ) return ( value * PRESSURE_CONVERSION[from_type].from_ * PRESSURE_CONVERSION[to_type].to ) if __name__ == "__main__": import doctest doctest.testmod()
-1
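A one-line sanity check of the conversion formula in the record above, using its first doctest: converting 4 atm to pascal multiplies the value by the atm from_ factor (1) and the pascal to factor (101325).

assert 4 * 1 * 101325 == 405300  # pressure_conversion(4, "atm", "pascal")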
TheAlgorithms/Python
5,638
Add pyupgrade to pre-commit
### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2021-10-28T14:37:17Z"
"2021-10-28T14:45:59Z"
70368a757e8d37b9f3dd96af4ca535275cb39580
477cc3fe597fd931c742700284016b937c778fe1
Add pyupgrade to pre-commit. ### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" The algorithm finds the pattern in given text using following rule. The bad-character rule considers the mismatched character in Text. The next occurrence of that character to the left in Pattern is found, If the mismatched character occurs to the left in Pattern, a shift is proposed that aligns text block and pattern. If the mismatched character does not occur to the left in Pattern, a shift is proposed that moves the entirety of Pattern past the point of mismatch in the text. If there no mismatch then the pattern matches with text block. Time Complexity : O(n/m) n=length of main string m=length of pattern string """ from __future__ import annotations class BoyerMooreSearch: def __init__(self, text: str, pattern: str): self.text, self.pattern = text, pattern self.textLen, self.patLen = len(text), len(pattern) def match_in_pattern(self, char: str) -> int: """finds the index of char in pattern in reverse order Parameters : char (chr): character to be searched Returns : i (int): index of char from last in pattern -1 (int): if char is not found in pattern """ for i in range(self.patLen - 1, -1, -1): if char == self.pattern[i]: return i return -1 def mismatch_in_text(self, currentPos: int) -> int: """ find the index of mis-matched character in text when compared with pattern from last Parameters : currentPos (int): current index position of text Returns : i (int): index of mismatched char from last in text -1 (int): if there is no mismatch between pattern and text block """ for i in range(self.patLen - 1, -1, -1): if self.pattern[i] != self.text[currentPos + i]: return currentPos + i return -1 def bad_character_heuristic(self) -> list[int]: # searches pattern in text and returns index positions positions = [] for i in range(self.textLen - self.patLen + 1): mismatch_index = self.mismatch_in_text(i) if mismatch_index == -1: positions.append(i) else: match_index = self.match_in_pattern(self.text[mismatch_index]) i = ( mismatch_index - match_index ) # shifting index lgtm [py/multiple-definition] return positions text = "ABAABA" pattern = "AB" bms = BoyerMooreSearch(text, pattern) positions = bms.bad_character_heuristic() if len(positions) == 0: print("No match found") else: print("Pattern found in following positions: ") print(positions)
""" The algorithm finds the pattern in given text using following rule. The bad-character rule considers the mismatched character in Text. The next occurrence of that character to the left in Pattern is found, If the mismatched character occurs to the left in Pattern, a shift is proposed that aligns text block and pattern. If the mismatched character does not occur to the left in Pattern, a shift is proposed that moves the entirety of Pattern past the point of mismatch in the text. If there no mismatch then the pattern matches with text block. Time Complexity : O(n/m) n=length of main string m=length of pattern string """ from __future__ import annotations class BoyerMooreSearch: def __init__(self, text: str, pattern: str): self.text, self.pattern = text, pattern self.textLen, self.patLen = len(text), len(pattern) def match_in_pattern(self, char: str) -> int: """finds the index of char in pattern in reverse order Parameters : char (chr): character to be searched Returns : i (int): index of char from last in pattern -1 (int): if char is not found in pattern """ for i in range(self.patLen - 1, -1, -1): if char == self.pattern[i]: return i return -1 def mismatch_in_text(self, currentPos: int) -> int: """ find the index of mis-matched character in text when compared with pattern from last Parameters : currentPos (int): current index position of text Returns : i (int): index of mismatched char from last in text -1 (int): if there is no mismatch between pattern and text block """ for i in range(self.patLen - 1, -1, -1): if self.pattern[i] != self.text[currentPos + i]: return currentPos + i return -1 def bad_character_heuristic(self) -> list[int]: # searches pattern in text and returns index positions positions = [] for i in range(self.textLen - self.patLen + 1): mismatch_index = self.mismatch_in_text(i) if mismatch_index == -1: positions.append(i) else: match_index = self.match_in_pattern(self.text[mismatch_index]) i = ( mismatch_index - match_index ) # shifting index lgtm [py/multiple-definition] return positions text = "ABAABA" pattern = "AB" bms = BoyerMooreSearch(text, pattern) positions = bms.bad_character_heuristic() if len(positions) == 0: print("No match found") else: print("Pattern found in following positions: ") print(positions)
-1
TheAlgorithms/Python
5,638
Add pyupgrade to pre-commit
### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2021-10-28T14:37:17Z"
"2021-10-28T14:45:59Z"
70368a757e8d37b9f3dd96af4ca535275cb39580
477cc3fe597fd931c742700284016b937c778fe1
Add pyupgrade to pre-commit. ### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
def apply_table(inp, table): """ >>> apply_table("0123456789", list(range(10))) '9012345678' >>> apply_table("0123456789", list(range(9, -1, -1))) '8765432109' """ res = "" for i in table: res += inp[i - 1] return res def left_shift(data): """ >>> left_shift("0123456789") '1234567890' """ return data[1:] + data[0] def XOR(a, b): """ >>> XOR("01010101", "00001111") '01011010' """ res = "" for i in range(len(a)): if a[i] == b[i]: res += "0" else: res += "1" return res def apply_sbox(s, data): row = int("0b" + data[0] + data[-1], 2) col = int("0b" + data[1:3], 2) return bin(s[row][col])[2:] def function(expansion, s0, s1, key, message): left = message[:4] right = message[4:] temp = apply_table(right, expansion) temp = XOR(temp, key) l = apply_sbox(s0, temp[:4]) # noqa: E741 r = apply_sbox(s1, temp[4:]) l = "0" * (2 - len(l)) + l # noqa: E741 r = "0" * (2 - len(r)) + r temp = apply_table(l + r, p4_table) temp = XOR(left, temp) return temp + right if __name__ == "__main__": key = input("Enter 10 bit key: ") message = input("Enter 8 bit message: ") p8_table = [6, 3, 7, 4, 8, 5, 10, 9] p10_table = [3, 5, 2, 7, 4, 10, 1, 9, 8, 6] p4_table = [2, 4, 3, 1] IP = [2, 6, 3, 1, 4, 8, 5, 7] IP_inv = [4, 1, 3, 5, 7, 2, 8, 6] expansion = [4, 1, 2, 3, 2, 3, 4, 1] s0 = [[1, 0, 3, 2], [3, 2, 1, 0], [0, 2, 1, 3], [3, 1, 3, 2]] s1 = [[0, 1, 2, 3], [2, 0, 1, 3], [3, 0, 1, 0], [2, 1, 0, 3]] # key generation temp = apply_table(key, p10_table) left = temp[:5] right = temp[5:] left = left_shift(left) right = left_shift(right) key1 = apply_table(left + right, p8_table) left = left_shift(left) right = left_shift(right) left = left_shift(left) right = left_shift(right) key2 = apply_table(left + right, p8_table) # encryption temp = apply_table(message, IP) temp = function(expansion, s0, s1, key1, temp) temp = temp[4:] + temp[:4] temp = function(expansion, s0, s1, key2, temp) CT = apply_table(temp, IP_inv) print("Cipher text is:", CT) # decryption temp = apply_table(CT, IP) temp = function(expansion, s0, s1, key2, temp) temp = temp[4:] + temp[:4] temp = function(expansion, s0, s1, key1, temp) PT = apply_table(temp, IP_inv) print("Plain text after decypting is:", PT)
def apply_table(inp, table): """ >>> apply_table("0123456789", list(range(10))) '9012345678' >>> apply_table("0123456789", list(range(9, -1, -1))) '8765432109' """ res = "" for i in table: res += inp[i - 1] return res def left_shift(data): """ >>> left_shift("0123456789") '1234567890' """ return data[1:] + data[0] def XOR(a, b): """ >>> XOR("01010101", "00001111") '01011010' """ res = "" for i in range(len(a)): if a[i] == b[i]: res += "0" else: res += "1" return res def apply_sbox(s, data): row = int("0b" + data[0] + data[-1], 2) col = int("0b" + data[1:3], 2) return bin(s[row][col])[2:] def function(expansion, s0, s1, key, message): left = message[:4] right = message[4:] temp = apply_table(right, expansion) temp = XOR(temp, key) l = apply_sbox(s0, temp[:4]) # noqa: E741 r = apply_sbox(s1, temp[4:]) l = "0" * (2 - len(l)) + l # noqa: E741 r = "0" * (2 - len(r)) + r temp = apply_table(l + r, p4_table) temp = XOR(left, temp) return temp + right if __name__ == "__main__": key = input("Enter 10 bit key: ") message = input("Enter 8 bit message: ") p8_table = [6, 3, 7, 4, 8, 5, 10, 9] p10_table = [3, 5, 2, 7, 4, 10, 1, 9, 8, 6] p4_table = [2, 4, 3, 1] IP = [2, 6, 3, 1, 4, 8, 5, 7] IP_inv = [4, 1, 3, 5, 7, 2, 8, 6] expansion = [4, 1, 2, 3, 2, 3, 4, 1] s0 = [[1, 0, 3, 2], [3, 2, 1, 0], [0, 2, 1, 3], [3, 1, 3, 2]] s1 = [[0, 1, 2, 3], [2, 0, 1, 3], [3, 0, 1, 0], [2, 1, 0, 3]] # key generation temp = apply_table(key, p10_table) left = temp[:5] right = temp[5:] left = left_shift(left) right = left_shift(right) key1 = apply_table(left + right, p8_table) left = left_shift(left) right = left_shift(right) left = left_shift(left) right = left_shift(right) key2 = apply_table(left + right, p8_table) # encryption temp = apply_table(message, IP) temp = function(expansion, s0, s1, key1, temp) temp = temp[4:] + temp[:4] temp = function(expansion, s0, s1, key2, temp) CT = apply_table(temp, IP_inv) print("Cipher text is:", CT) # decryption temp = apply_table(CT, IP) temp = function(expansion, s0, s1, key2, temp) temp = temp[4:] + temp[:4] temp = function(expansion, s0, s1, key1, temp) PT = apply_table(temp, IP_inv) print("Plain text after decypting is:", PT)
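The script above reads the key and message interactively. The following sketch shows a non-interactive encrypt/decrypt round trip under the assumption that the helper functions (apply_table, left_shift, XOR, apply_sbox, function) from the file above are defined in the same module; `function` reads p4_table from module scope, so the tables must also live at module level. The key and message are arbitrary illustrative bit strings, not values taken from the original file.

# Non-interactive round-trip sketch for the simplified-DES helpers above.
p8_table = [6, 3, 7, 4, 8, 5, 10, 9]
p10_table = [3, 5, 2, 7, 4, 10, 1, 9, 8, 6]
p4_table = [2, 4, 3, 1]
IP = [2, 6, 3, 1, 4, 8, 5, 7]
IP_inv = [4, 1, 3, 5, 7, 2, 8, 6]
expansion = [4, 1, 2, 3, 2, 3, 4, 1]
s0 = [[1, 0, 3, 2], [3, 2, 1, 0], [0, 2, 1, 3], [3, 1, 3, 2]]
s1 = [[0, 1, 2, 3], [2, 0, 1, 3], [3, 0, 1, 0], [2, 1, 0, 3]]

key = "1010000010"    # arbitrary 10-bit key for illustration
message = "11010111"  # arbitrary 8-bit plaintext for illustration

# Key schedule: P10, one left shift -> K1, two more left shifts -> K2.
temp = apply_table(key, p10_table)
left, right = left_shift(temp[:5]), left_shift(temp[5:])
key1 = apply_table(left + right, p8_table)
left, right = left_shift(left_shift(left)), left_shift(left_shift(right))
key2 = apply_table(left + right, p8_table)

# Encrypt: IP, round with K1, swap halves, round with K2, inverse IP.
temp = function(expansion, s0, s1, key1, apply_table(message, IP))
temp = temp[4:] + temp[:4]
ciphertext = apply_table(function(expansion, s0, s1, key2, temp), IP_inv)

# Decrypt with the subkeys in reverse order and check the round trip.
temp = function(expansion, s0, s1, key2, apply_table(ciphertext, IP))
temp = temp[4:] + temp[:4]
plaintext = apply_table(function(expansion, s0, s1, key1, temp), IP_inv)
assert plaintext == message
print("Cipher text is:", ciphertext)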
-1
TheAlgorithms/Python
5,638
Add pyupgrade to pre-commit
### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2021-10-28T14:37:17Z"
"2021-10-28T14:45:59Z"
70368a757e8d37b9f3dd96af4ca535275cb39580
477cc3fe597fd931c742700284016b937c778fe1
Add pyupgrade to pre-commit. ### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" This is a python3 implementation of binary search tree using recursion To run tests: python -m unittest binary_search_tree_recursive.py To run an example: python binary_search_tree_recursive.py """ from __future__ import annotations import unittest from typing import Iterator class Node: def __init__(self, label: int, parent: Node | None) -> None: self.label = label self.parent = parent self.left: Node | None = None self.right: Node | None = None class BinarySearchTree: def __init__(self) -> None: self.root: Node | None = None def empty(self) -> None: """ Empties the tree >>> t = BinarySearchTree() >>> assert t.root is None >>> t.put(8) >>> assert t.root is not None """ self.root = None def is_empty(self) -> bool: """ Checks if the tree is empty >>> t = BinarySearchTree() >>> t.is_empty() True >>> t.put(8) >>> t.is_empty() False """ return self.root is None def put(self, label: int) -> None: """ Put a new node in the tree >>> t = BinarySearchTree() >>> t.put(8) >>> assert t.root.parent is None >>> assert t.root.label == 8 >>> t.put(10) >>> assert t.root.right.parent == t.root >>> assert t.root.right.label == 10 >>> t.put(3) >>> assert t.root.left.parent == t.root >>> assert t.root.left.label == 3 """ self.root = self._put(self.root, label) def _put(self, node: Node | None, label: int, parent: Node | None = None) -> Node: if node is None: node = Node(label, parent) else: if label < node.label: node.left = self._put(node.left, label, node) elif label > node.label: node.right = self._put(node.right, label, node) else: raise Exception(f"Node with label {label} already exists") return node def search(self, label: int) -> Node: """ Searches a node in the tree >>> t = BinarySearchTree() >>> t.put(8) >>> t.put(10) >>> node = t.search(8) >>> assert node.label == 8 >>> node = t.search(3) Traceback (most recent call last): ... Exception: Node with label 3 does not exist """ return self._search(self.root, label) def _search(self, node: Node | None, label: int) -> Node: if node is None: raise Exception(f"Node with label {label} does not exist") else: if label < node.label: node = self._search(node.left, label) elif label > node.label: node = self._search(node.right, label) return node def remove(self, label: int) -> None: """ Removes a node in the tree >>> t = BinarySearchTree() >>> t.put(8) >>> t.put(10) >>> t.remove(8) >>> assert t.root.label == 10 >>> t.remove(3) Traceback (most recent call last): ... 
Exception: Node with label 3 does not exist """ node = self.search(label) if node.right and node.left: lowest_node = self._get_lowest_node(node.right) lowest_node.left = node.left lowest_node.right = node.right node.left.parent = lowest_node if node.right: node.right.parent = lowest_node self._reassign_nodes(node, lowest_node) elif not node.right and node.left: self._reassign_nodes(node, node.left) elif node.right and not node.left: self._reassign_nodes(node, node.right) else: self._reassign_nodes(node, None) def _reassign_nodes(self, node: Node, new_children: Node | None) -> None: if new_children: new_children.parent = node.parent if node.parent: if node.parent.right == node: node.parent.right = new_children else: node.parent.left = new_children else: self.root = new_children def _get_lowest_node(self, node: Node) -> Node: if node.left: lowest_node = self._get_lowest_node(node.left) else: lowest_node = node self._reassign_nodes(node, node.right) return lowest_node def exists(self, label: int) -> bool: """ Checks if a node exists in the tree >>> t = BinarySearchTree() >>> t.put(8) >>> t.put(10) >>> t.exists(8) True >>> t.exists(3) False """ try: self.search(label) return True except Exception: return False def get_max_label(self) -> int: """ Gets the max label inserted in the tree >>> t = BinarySearchTree() >>> t.get_max_label() Traceback (most recent call last): ... Exception: Binary search tree is empty >>> t.put(8) >>> t.put(10) >>> t.get_max_label() 10 """ if self.root is None: raise Exception("Binary search tree is empty") node = self.root while node.right is not None: node = node.right return node.label def get_min_label(self) -> int: """ Gets the min label inserted in the tree >>> t = BinarySearchTree() >>> t.get_min_label() Traceback (most recent call last): ... 
Exception: Binary search tree is empty >>> t.put(8) >>> t.put(10) >>> t.get_min_label() 8 """ if self.root is None: raise Exception("Binary search tree is empty") node = self.root while node.left is not None: node = node.left return node.label def inorder_traversal(self) -> Iterator[Node]: """ Return the inorder traversal of the tree >>> t = BinarySearchTree() >>> [i.label for i in t.inorder_traversal()] [] >>> t.put(8) >>> t.put(10) >>> t.put(9) >>> [i.label for i in t.inorder_traversal()] [8, 9, 10] """ return self._inorder_traversal(self.root) def _inorder_traversal(self, node: Node | None) -> Iterator[Node]: if node is not None: yield from self._inorder_traversal(node.left) yield node yield from self._inorder_traversal(node.right) def preorder_traversal(self) -> Iterator[Node]: """ Return the preorder traversal of the tree >>> t = BinarySearchTree() >>> [i.label for i in t.preorder_traversal()] [] >>> t.put(8) >>> t.put(10) >>> t.put(9) >>> [i.label for i in t.preorder_traversal()] [8, 10, 9] """ return self._preorder_traversal(self.root) def _preorder_traversal(self, node: Node | None) -> Iterator[Node]: if node is not None: yield node yield from self._preorder_traversal(node.left) yield from self._preorder_traversal(node.right) class BinarySearchTreeTest(unittest.TestCase): @staticmethod def _get_binary_search_tree() -> BinarySearchTree: r""" 8 / \ 3 10 / \ \ 1 6 14 / \ / 4 7 13 \ 5 """ t = BinarySearchTree() t.put(8) t.put(3) t.put(6) t.put(1) t.put(10) t.put(14) t.put(13) t.put(4) t.put(7) t.put(5) return t def test_put(self) -> None: t = BinarySearchTree() assert t.is_empty() t.put(8) r""" 8 """ assert t.root is not None assert t.root.parent is None assert t.root.label == 8 t.put(10) r""" 8 \ 10 """ assert t.root.right is not None assert t.root.right.parent == t.root assert t.root.right.label == 10 t.put(3) r""" 8 / \ 3 10 """ assert t.root.left is not None assert t.root.left.parent == t.root assert t.root.left.label == 3 t.put(6) r""" 8 / \ 3 10 \ 6 """ assert t.root.left.right is not None assert t.root.left.right.parent == t.root.left assert t.root.left.right.label == 6 t.put(1) r""" 8 / \ 3 10 / \ 1 6 """ assert t.root.left.left is not None assert t.root.left.left.parent == t.root.left assert t.root.left.left.label == 1 with self.assertRaises(Exception): t.put(1) def test_search(self) -> None: t = self._get_binary_search_tree() node = t.search(6) assert node.label == 6 node = t.search(13) assert node.label == 13 with self.assertRaises(Exception): t.search(2) def test_remove(self) -> None: t = self._get_binary_search_tree() t.remove(13) r""" 8 / \ 3 10 / \ \ 1 6 14 / \ 4 7 \ 5 """ assert t.root is not None assert t.root.right is not None assert t.root.right.right is not None assert t.root.right.right.right is None assert t.root.right.right.left is None t.remove(7) r""" 8 / \ 3 10 / \ \ 1 6 14 / 4 \ 5 """ assert t.root.left is not None assert t.root.left.right is not None assert t.root.left.right.left is not None assert t.root.left.right.right is None assert t.root.left.right.left.label == 4 t.remove(6) r""" 8 / \ 3 10 / \ \ 1 4 14 \ 5 """ assert t.root.left.left is not None assert t.root.left.right.right is not None assert t.root.left.left.label == 1 assert t.root.left.right.label == 4 assert t.root.left.right.right.label == 5 assert t.root.left.right.left is None assert t.root.left.left.parent == t.root.left assert t.root.left.right.parent == t.root.left t.remove(3) r""" 8 / \ 4 10 / \ \ 1 5 14 """ assert t.root is not None assert t.root.left.label == 4 assert 
t.root.left.right.label == 5 assert t.root.left.left.label == 1 assert t.root.left.parent == t.root assert t.root.left.left.parent == t.root.left assert t.root.left.right.parent == t.root.left t.remove(4) r""" 8 / \ 5 10 / \ 1 14 """ assert t.root.left is not None assert t.root.left.left is not None assert t.root.left.label == 5 assert t.root.left.right is None assert t.root.left.left.label == 1 assert t.root.left.parent == t.root assert t.root.left.left.parent == t.root.left def test_remove_2(self) -> None: t = self._get_binary_search_tree() t.remove(3) r""" 8 / \ 4 10 / \ \ 1 6 14 / \ / 5 7 13 """ assert t.root is not None assert t.root.left is not None assert t.root.left.left is not None assert t.root.left.right is not None assert t.root.left.right.left is not None assert t.root.left.right.right is not None assert t.root.left.label == 4 assert t.root.left.right.label == 6 assert t.root.left.left.label == 1 assert t.root.left.right.right.label == 7 assert t.root.left.right.left.label == 5 assert t.root.left.parent == t.root assert t.root.left.right.parent == t.root.left assert t.root.left.left.parent == t.root.left assert t.root.left.right.left.parent == t.root.left.right def test_empty(self) -> None: t = self._get_binary_search_tree() t.empty() assert t.root is None def test_is_empty(self) -> None: t = self._get_binary_search_tree() assert not t.is_empty() t.empty() assert t.is_empty() def test_exists(self) -> None: t = self._get_binary_search_tree() assert t.exists(6) assert not t.exists(-1) def test_get_max_label(self) -> None: t = self._get_binary_search_tree() assert t.get_max_label() == 14 t.empty() with self.assertRaises(Exception): t.get_max_label() def test_get_min_label(self) -> None: t = self._get_binary_search_tree() assert t.get_min_label() == 1 t.empty() with self.assertRaises(Exception): t.get_min_label() def test_inorder_traversal(self) -> None: t = self._get_binary_search_tree() inorder_traversal_nodes = [i.label for i in t.inorder_traversal()] assert inorder_traversal_nodes == [1, 3, 4, 5, 6, 7, 8, 10, 13, 14] def test_preorder_traversal(self) -> None: t = self._get_binary_search_tree() preorder_traversal_nodes = [i.label for i in t.preorder_traversal()] assert preorder_traversal_nodes == [8, 3, 1, 6, 4, 5, 7, 10, 14, 13] def binary_search_tree_example() -> None: r""" Example 8 / \ 3 10 / \ \ 1 6 14 / \ / 4 7 13 \ 5 Example After Deletion 4 / \ 1 7 \ 5 """ t = BinarySearchTree() t.put(8) t.put(3) t.put(6) t.put(1) t.put(10) t.put(14) t.put(13) t.put(4) t.put(7) t.put(5) print( """ 8 / \\ 3 10 / \\ \\ 1 6 14 / \\ / 4 7 13 \\ 5 """ ) print("Label 6 exists:", t.exists(6)) print("Label 13 exists:", t.exists(13)) print("Label -1 exists:", t.exists(-1)) print("Label 12 exists:", t.exists(12)) # Prints all the elements of the list in inorder traversal inorder_traversal_nodes = [i.label for i in t.inorder_traversal()] print("Inorder traversal:", inorder_traversal_nodes) # Prints all the elements of the list in preorder traversal preorder_traversal_nodes = [i.label for i in t.preorder_traversal()] print("Preorder traversal:", preorder_traversal_nodes) print("Max. label:", t.get_max_label()) print("Min. 
label:", t.get_min_label()) # Delete elements print("\nDeleting elements 13, 10, 8, 3, 6, 14") print( """ 4 / \\ 1 7 \\ 5 """ ) t.remove(13) t.remove(10) t.remove(8) t.remove(3) t.remove(6) t.remove(14) # Prints all the elements of the list in inorder traversal after delete inorder_traversal_nodes = [i.label for i in t.inorder_traversal()] print("Inorder traversal after delete:", inorder_traversal_nodes) # Prints all the elements of the list in preorder traversal after delete preorder_traversal_nodes = [i.label for i in t.preorder_traversal()] print("Preorder traversal after delete:", preorder_traversal_nodes) print("Max. label:", t.get_max_label()) print("Min. label:", t.get_min_label()) if __name__ == "__main__": binary_search_tree_example()
""" This is a python3 implementation of binary search tree using recursion To run tests: python -m unittest binary_search_tree_recursive.py To run an example: python binary_search_tree_recursive.py """ from __future__ import annotations import unittest from typing import Iterator class Node: def __init__(self, label: int, parent: Node | None) -> None: self.label = label self.parent = parent self.left: Node | None = None self.right: Node | None = None class BinarySearchTree: def __init__(self) -> None: self.root: Node | None = None def empty(self) -> None: """ Empties the tree >>> t = BinarySearchTree() >>> assert t.root is None >>> t.put(8) >>> assert t.root is not None """ self.root = None def is_empty(self) -> bool: """ Checks if the tree is empty >>> t = BinarySearchTree() >>> t.is_empty() True >>> t.put(8) >>> t.is_empty() False """ return self.root is None def put(self, label: int) -> None: """ Put a new node in the tree >>> t = BinarySearchTree() >>> t.put(8) >>> assert t.root.parent is None >>> assert t.root.label == 8 >>> t.put(10) >>> assert t.root.right.parent == t.root >>> assert t.root.right.label == 10 >>> t.put(3) >>> assert t.root.left.parent == t.root >>> assert t.root.left.label == 3 """ self.root = self._put(self.root, label) def _put(self, node: Node | None, label: int, parent: Node | None = None) -> Node: if node is None: node = Node(label, parent) else: if label < node.label: node.left = self._put(node.left, label, node) elif label > node.label: node.right = self._put(node.right, label, node) else: raise Exception(f"Node with label {label} already exists") return node def search(self, label: int) -> Node: """ Searches a node in the tree >>> t = BinarySearchTree() >>> t.put(8) >>> t.put(10) >>> node = t.search(8) >>> assert node.label == 8 >>> node = t.search(3) Traceback (most recent call last): ... Exception: Node with label 3 does not exist """ return self._search(self.root, label) def _search(self, node: Node | None, label: int) -> Node: if node is None: raise Exception(f"Node with label {label} does not exist") else: if label < node.label: node = self._search(node.left, label) elif label > node.label: node = self._search(node.right, label) return node def remove(self, label: int) -> None: """ Removes a node in the tree >>> t = BinarySearchTree() >>> t.put(8) >>> t.put(10) >>> t.remove(8) >>> assert t.root.label == 10 >>> t.remove(3) Traceback (most recent call last): ... 
Exception: Node with label 3 does not exist """ node = self.search(label) if node.right and node.left: lowest_node = self._get_lowest_node(node.right) lowest_node.left = node.left lowest_node.right = node.right node.left.parent = lowest_node if node.right: node.right.parent = lowest_node self._reassign_nodes(node, lowest_node) elif not node.right and node.left: self._reassign_nodes(node, node.left) elif node.right and not node.left: self._reassign_nodes(node, node.right) else: self._reassign_nodes(node, None) def _reassign_nodes(self, node: Node, new_children: Node | None) -> None: if new_children: new_children.parent = node.parent if node.parent: if node.parent.right == node: node.parent.right = new_children else: node.parent.left = new_children else: self.root = new_children def _get_lowest_node(self, node: Node) -> Node: if node.left: lowest_node = self._get_lowest_node(node.left) else: lowest_node = node self._reassign_nodes(node, node.right) return lowest_node def exists(self, label: int) -> bool: """ Checks if a node exists in the tree >>> t = BinarySearchTree() >>> t.put(8) >>> t.put(10) >>> t.exists(8) True >>> t.exists(3) False """ try: self.search(label) return True except Exception: return False def get_max_label(self) -> int: """ Gets the max label inserted in the tree >>> t = BinarySearchTree() >>> t.get_max_label() Traceback (most recent call last): ... Exception: Binary search tree is empty >>> t.put(8) >>> t.put(10) >>> t.get_max_label() 10 """ if self.root is None: raise Exception("Binary search tree is empty") node = self.root while node.right is not None: node = node.right return node.label def get_min_label(self) -> int: """ Gets the min label inserted in the tree >>> t = BinarySearchTree() >>> t.get_min_label() Traceback (most recent call last): ... 
Exception: Binary search tree is empty >>> t.put(8) >>> t.put(10) >>> t.get_min_label() 8 """ if self.root is None: raise Exception("Binary search tree is empty") node = self.root while node.left is not None: node = node.left return node.label def inorder_traversal(self) -> Iterator[Node]: """ Return the inorder traversal of the tree >>> t = BinarySearchTree() >>> [i.label for i in t.inorder_traversal()] [] >>> t.put(8) >>> t.put(10) >>> t.put(9) >>> [i.label for i in t.inorder_traversal()] [8, 9, 10] """ return self._inorder_traversal(self.root) def _inorder_traversal(self, node: Node | None) -> Iterator[Node]: if node is not None: yield from self._inorder_traversal(node.left) yield node yield from self._inorder_traversal(node.right) def preorder_traversal(self) -> Iterator[Node]: """ Return the preorder traversal of the tree >>> t = BinarySearchTree() >>> [i.label for i in t.preorder_traversal()] [] >>> t.put(8) >>> t.put(10) >>> t.put(9) >>> [i.label for i in t.preorder_traversal()] [8, 10, 9] """ return self._preorder_traversal(self.root) def _preorder_traversal(self, node: Node | None) -> Iterator[Node]: if node is not None: yield node yield from self._preorder_traversal(node.left) yield from self._preorder_traversal(node.right) class BinarySearchTreeTest(unittest.TestCase): @staticmethod def _get_binary_search_tree() -> BinarySearchTree: r""" 8 / \ 3 10 / \ \ 1 6 14 / \ / 4 7 13 \ 5 """ t = BinarySearchTree() t.put(8) t.put(3) t.put(6) t.put(1) t.put(10) t.put(14) t.put(13) t.put(4) t.put(7) t.put(5) return t def test_put(self) -> None: t = BinarySearchTree() assert t.is_empty() t.put(8) r""" 8 """ assert t.root is not None assert t.root.parent is None assert t.root.label == 8 t.put(10) r""" 8 \ 10 """ assert t.root.right is not None assert t.root.right.parent == t.root assert t.root.right.label == 10 t.put(3) r""" 8 / \ 3 10 """ assert t.root.left is not None assert t.root.left.parent == t.root assert t.root.left.label == 3 t.put(6) r""" 8 / \ 3 10 \ 6 """ assert t.root.left.right is not None assert t.root.left.right.parent == t.root.left assert t.root.left.right.label == 6 t.put(1) r""" 8 / \ 3 10 / \ 1 6 """ assert t.root.left.left is not None assert t.root.left.left.parent == t.root.left assert t.root.left.left.label == 1 with self.assertRaises(Exception): t.put(1) def test_search(self) -> None: t = self._get_binary_search_tree() node = t.search(6) assert node.label == 6 node = t.search(13) assert node.label == 13 with self.assertRaises(Exception): t.search(2) def test_remove(self) -> None: t = self._get_binary_search_tree() t.remove(13) r""" 8 / \ 3 10 / \ \ 1 6 14 / \ 4 7 \ 5 """ assert t.root is not None assert t.root.right is not None assert t.root.right.right is not None assert t.root.right.right.right is None assert t.root.right.right.left is None t.remove(7) r""" 8 / \ 3 10 / \ \ 1 6 14 / 4 \ 5 """ assert t.root.left is not None assert t.root.left.right is not None assert t.root.left.right.left is not None assert t.root.left.right.right is None assert t.root.left.right.left.label == 4 t.remove(6) r""" 8 / \ 3 10 / \ \ 1 4 14 \ 5 """ assert t.root.left.left is not None assert t.root.left.right.right is not None assert t.root.left.left.label == 1 assert t.root.left.right.label == 4 assert t.root.left.right.right.label == 5 assert t.root.left.right.left is None assert t.root.left.left.parent == t.root.left assert t.root.left.right.parent == t.root.left t.remove(3) r""" 8 / \ 4 10 / \ \ 1 5 14 """ assert t.root is not None assert t.root.left.label == 4 assert 
t.root.left.right.label == 5 assert t.root.left.left.label == 1 assert t.root.left.parent == t.root assert t.root.left.left.parent == t.root.left assert t.root.left.right.parent == t.root.left t.remove(4) r""" 8 / \ 5 10 / \ 1 14 """ assert t.root.left is not None assert t.root.left.left is not None assert t.root.left.label == 5 assert t.root.left.right is None assert t.root.left.left.label == 1 assert t.root.left.parent == t.root assert t.root.left.left.parent == t.root.left def test_remove_2(self) -> None: t = self._get_binary_search_tree() t.remove(3) r""" 8 / \ 4 10 / \ \ 1 6 14 / \ / 5 7 13 """ assert t.root is not None assert t.root.left is not None assert t.root.left.left is not None assert t.root.left.right is not None assert t.root.left.right.left is not None assert t.root.left.right.right is not None assert t.root.left.label == 4 assert t.root.left.right.label == 6 assert t.root.left.left.label == 1 assert t.root.left.right.right.label == 7 assert t.root.left.right.left.label == 5 assert t.root.left.parent == t.root assert t.root.left.right.parent == t.root.left assert t.root.left.left.parent == t.root.left assert t.root.left.right.left.parent == t.root.left.right def test_empty(self) -> None: t = self._get_binary_search_tree() t.empty() assert t.root is None def test_is_empty(self) -> None: t = self._get_binary_search_tree() assert not t.is_empty() t.empty() assert t.is_empty() def test_exists(self) -> None: t = self._get_binary_search_tree() assert t.exists(6) assert not t.exists(-1) def test_get_max_label(self) -> None: t = self._get_binary_search_tree() assert t.get_max_label() == 14 t.empty() with self.assertRaises(Exception): t.get_max_label() def test_get_min_label(self) -> None: t = self._get_binary_search_tree() assert t.get_min_label() == 1 t.empty() with self.assertRaises(Exception): t.get_min_label() def test_inorder_traversal(self) -> None: t = self._get_binary_search_tree() inorder_traversal_nodes = [i.label for i in t.inorder_traversal()] assert inorder_traversal_nodes == [1, 3, 4, 5, 6, 7, 8, 10, 13, 14] def test_preorder_traversal(self) -> None: t = self._get_binary_search_tree() preorder_traversal_nodes = [i.label for i in t.preorder_traversal()] assert preorder_traversal_nodes == [8, 3, 1, 6, 4, 5, 7, 10, 14, 13] def binary_search_tree_example() -> None: r""" Example 8 / \ 3 10 / \ \ 1 6 14 / \ / 4 7 13 \ 5 Example After Deletion 4 / \ 1 7 \ 5 """ t = BinarySearchTree() t.put(8) t.put(3) t.put(6) t.put(1) t.put(10) t.put(14) t.put(13) t.put(4) t.put(7) t.put(5) print( """ 8 / \\ 3 10 / \\ \\ 1 6 14 / \\ / 4 7 13 \\ 5 """ ) print("Label 6 exists:", t.exists(6)) print("Label 13 exists:", t.exists(13)) print("Label -1 exists:", t.exists(-1)) print("Label 12 exists:", t.exists(12)) # Prints all the elements of the list in inorder traversal inorder_traversal_nodes = [i.label for i in t.inorder_traversal()] print("Inorder traversal:", inorder_traversal_nodes) # Prints all the elements of the list in preorder traversal preorder_traversal_nodes = [i.label for i in t.preorder_traversal()] print("Preorder traversal:", preorder_traversal_nodes) print("Max. label:", t.get_max_label()) print("Min. 
label:", t.get_min_label()) # Delete elements print("\nDeleting elements 13, 10, 8, 3, 6, 14") print( """ 4 / \\ 1 7 \\ 5 """ ) t.remove(13) t.remove(10) t.remove(8) t.remove(3) t.remove(6) t.remove(14) # Prints all the elements of the list in inorder traversal after delete inorder_traversal_nodes = [i.label for i in t.inorder_traversal()] print("Inorder traversal after delete:", inorder_traversal_nodes) # Prints all the elements of the list in preorder traversal after delete preorder_traversal_nodes = [i.label for i in t.preorder_traversal()] print("Preorder traversal after delete:", preorder_traversal_nodes) print("Max. label:", t.get_max_label()) print("Min. label:", t.get_min_label()) if __name__ == "__main__": binary_search_tree_example()
-1
TheAlgorithms/Python
5,638
Add pyupgrade to pre-commit
### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2021-10-28T14:37:17Z"
"2021-10-28T14:45:59Z"
70368a757e8d37b9f3dd96af4ca535275cb39580
477cc3fe597fd931c742700284016b937c778fe1
Add pyupgrade to pre-commit. ### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Get the citation from google scholar using title and year of publication, and volume and pages of journal. """ import requests from bs4 import BeautifulSoup def get_citation(base_url: str, params: dict) -> str: """ Return the citation number. """ soup = BeautifulSoup(requests.get(base_url, params=params).content, "html.parser") div = soup.find("div", attrs={"class": "gs_ri"}) anchors = div.find("div", attrs={"class": "gs_fl"}).find_all("a") return anchors[2].get_text() if __name__ == "__main__": params = { "title": ( "Precisely geometry controlled microsupercapacitors for ultrahigh areal " "capacitance, volumetric capacitance, and energy density" ), "journal": "Chem. Mater.", "volume": 30, "pages": "3979-3990", "year": 2018, "hl": "en", } print(get_citation("http://scholar.google.com/scholar_lookup", params=params))
""" Get the citation from google scholar using title and year of publication, and volume and pages of journal. """ import requests from bs4 import BeautifulSoup def get_citation(base_url: str, params: dict) -> str: """ Return the citation number. """ soup = BeautifulSoup(requests.get(base_url, params=params).content, "html.parser") div = soup.find("div", attrs={"class": "gs_ri"}) anchors = div.find("div", attrs={"class": "gs_fl"}).find_all("a") return anchors[2].get_text() if __name__ == "__main__": params = { "title": ( "Precisely geometry controlled microsupercapacitors for ultrahigh areal " "capacitance, volumetric capacitance, and energy density" ), "journal": "Chem. Mater.", "volume": 30, "pages": "3979-3990", "year": 2018, "hl": "en", } print(get_citation("http://scholar.google.com/scholar_lookup", params=params))
-1
TheAlgorithms/Python
5,638
Add pyupgrade to pre-commit
### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2021-10-28T14:37:17Z"
"2021-10-28T14:45:59Z"
70368a757e8d37b9f3dd96af4ca535275cb39580
477cc3fe597fd931c742700284016b937c778fe1
Add pyupgrade to pre-commit. ### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
#!/usr/bin/env python3 """ The Bifid Cipher uses a Polybius Square to encipher a message in a way that makes it fairly difficult to decipher without knowing the secret. https://www.braingle.com/brainteasers/codes/bifid.php """ import numpy as np class BifidCipher: def __init__(self) -> None: SQUARE = [ ["a", "b", "c", "d", "e"], ["f", "g", "h", "i", "k"], ["l", "m", "n", "o", "p"], ["q", "r", "s", "t", "u"], ["v", "w", "x", "y", "z"], ] self.SQUARE = np.array(SQUARE) def letter_to_numbers(self, letter: str) -> np.ndarray: """ Return the pair of numbers that represents the given letter in the polybius square >>> np.array_equal(BifidCipher().letter_to_numbers('a'), [1,1]) True >>> np.array_equal(BifidCipher().letter_to_numbers('u'), [4,5]) True """ index1, index2 = np.where(self.SQUARE == letter) indexes = np.concatenate([index1 + 1, index2 + 1]) return indexes def numbers_to_letter(self, index1: int, index2: int) -> str: """ Return the letter corresponding to the position [index1, index2] in the polybius square >>> BifidCipher().numbers_to_letter(4, 5) == "u" True >>> BifidCipher().numbers_to_letter(1, 1) == "a" True """ letter = self.SQUARE[index1 - 1, index2 - 1] return letter def encode(self, message: str) -> str: """ Return the encoded version of message according to the polybius cipher >>> BifidCipher().encode('testmessage') == 'qtltbdxrxlk' True >>> BifidCipher().encode('Test Message') == 'qtltbdxrxlk' True >>> BifidCipher().encode('test j') == BifidCipher().encode('test i') True """ message = message.lower() message = message.replace(" ", "") message = message.replace("j", "i") first_step = np.empty((2, len(message))) for letter_index in range(len(message)): numbers = self.letter_to_numbers(message[letter_index]) first_step[0, letter_index] = numbers[0] first_step[1, letter_index] = numbers[1] second_step = first_step.reshape(2 * len(message)) encoded_message = "" for numbers_index in range(len(message)): index1 = int(second_step[numbers_index * 2]) index2 = int(second_step[(numbers_index * 2) + 1]) letter = self.numbers_to_letter(index1, index2) encoded_message = encoded_message + letter return encoded_message def decode(self, message: str) -> str: """ Return the decoded version of message according to the polybius cipher >>> BifidCipher().decode('qtltbdxrxlk') == 'testmessage' True """ message = message.lower() message.replace(" ", "") first_step = np.empty(2 * len(message)) for letter_index in range(len(message)): numbers = self.letter_to_numbers(message[letter_index]) first_step[letter_index * 2] = numbers[0] first_step[letter_index * 2 + 1] = numbers[1] second_step = first_step.reshape((2, len(message))) decoded_message = "" for numbers_index in range(len(message)): index1 = int(second_step[0, numbers_index]) index2 = int(second_step[1, numbers_index]) letter = self.numbers_to_letter(index1, index2) decoded_message = decoded_message + letter return decoded_message
#!/usr/bin/env python3 """ The Bifid Cipher uses a Polybius Square to encipher a message in a way that makes it fairly difficult to decipher without knowing the secret. https://www.braingle.com/brainteasers/codes/bifid.php """ import numpy as np class BifidCipher: def __init__(self) -> None: SQUARE = [ ["a", "b", "c", "d", "e"], ["f", "g", "h", "i", "k"], ["l", "m", "n", "o", "p"], ["q", "r", "s", "t", "u"], ["v", "w", "x", "y", "z"], ] self.SQUARE = np.array(SQUARE) def letter_to_numbers(self, letter: str) -> np.ndarray: """ Return the pair of numbers that represents the given letter in the polybius square >>> np.array_equal(BifidCipher().letter_to_numbers('a'), [1,1]) True >>> np.array_equal(BifidCipher().letter_to_numbers('u'), [4,5]) True """ index1, index2 = np.where(self.SQUARE == letter) indexes = np.concatenate([index1 + 1, index2 + 1]) return indexes def numbers_to_letter(self, index1: int, index2: int) -> str: """ Return the letter corresponding to the position [index1, index2] in the polybius square >>> BifidCipher().numbers_to_letter(4, 5) == "u" True >>> BifidCipher().numbers_to_letter(1, 1) == "a" True """ letter = self.SQUARE[index1 - 1, index2 - 1] return letter def encode(self, message: str) -> str: """ Return the encoded version of message according to the polybius cipher >>> BifidCipher().encode('testmessage') == 'qtltbdxrxlk' True >>> BifidCipher().encode('Test Message') == 'qtltbdxrxlk' True >>> BifidCipher().encode('test j') == BifidCipher().encode('test i') True """ message = message.lower() message = message.replace(" ", "") message = message.replace("j", "i") first_step = np.empty((2, len(message))) for letter_index in range(len(message)): numbers = self.letter_to_numbers(message[letter_index]) first_step[0, letter_index] = numbers[0] first_step[1, letter_index] = numbers[1] second_step = first_step.reshape(2 * len(message)) encoded_message = "" for numbers_index in range(len(message)): index1 = int(second_step[numbers_index * 2]) index2 = int(second_step[(numbers_index * 2) + 1]) letter = self.numbers_to_letter(index1, index2) encoded_message = encoded_message + letter return encoded_message def decode(self, message: str) -> str: """ Return the decoded version of message according to the polybius cipher >>> BifidCipher().decode('qtltbdxrxlk') == 'testmessage' True """ message = message.lower() message.replace(" ", "") first_step = np.empty(2 * len(message)) for letter_index in range(len(message)): numbers = self.letter_to_numbers(message[letter_index]) first_step[letter_index * 2] = numbers[0] first_step[letter_index * 2 + 1] = numbers[1] second_step = first_step.reshape((2, len(message))) decoded_message = "" for numbers_index in range(len(message)): index1 = int(second_step[0, numbers_index]) index2 = int(second_step[1, numbers_index]) letter = self.numbers_to_letter(index1, index2) decoded_message = decoded_message + letter return decoded_message
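A round-trip sketch for the BifidCipher class above, using the same input as its doctests (it assumes the class and numpy are available). Note that encoding lowercases the input, strips spaces and maps 'j' to 'i', so decoding returns the normalised message rather than the exact original string.

# Minimal sketch, assuming BifidCipher (as defined above) is in scope.
cipher = BifidCipher()
encoded = cipher.encode("Test Message")
assert encoded == "qtltbdxrxlk"                 # matches the doctest above
assert cipher.decode(encoded) == "testmessage"  # normalised round trip
print(encoded)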
-1
TheAlgorithms/Python
5,638
Add pyupgrade to pre-commit
### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2021-10-28T14:37:17Z"
"2021-10-28T14:45:59Z"
70368a757e8d37b9f3dd96af4ca535275cb39580
477cc3fe597fd931c742700284016b937c778fe1
Add pyupgrade to pre-commit. ### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
# flake8: noqa """ This is pure Python implementation of tree traversal algorithms """ from __future__ import annotations import queue class TreeNode: def __init__(self, data): self.data = data self.right = None self.left = None def build_tree(): print("\n********Press N to stop entering at any point of time********\n") check = input("Enter the value of the root node: ").strip().lower() or "n" if check == "n": return None q: queue.Queue = queue.Queue() tree_node = TreeNode(int(check)) q.put(tree_node) while not q.empty(): node_found = q.get() msg = "Enter the left node of %s: " % node_found.data check = input(msg).strip().lower() or "n" if check == "n": return tree_node left_node = TreeNode(int(check)) node_found.left = left_node q.put(left_node) msg = "Enter the right node of %s: " % node_found.data check = input(msg).strip().lower() or "n" if check == "n": return tree_node right_node = TreeNode(int(check)) node_found.right = right_node q.put(right_node) def pre_order(node: TreeNode) -> None: """ >>> root = TreeNode(1) >>> tree_node2 = TreeNode(2) >>> tree_node3 = TreeNode(3) >>> tree_node4 = TreeNode(4) >>> tree_node5 = TreeNode(5) >>> tree_node6 = TreeNode(6) >>> tree_node7 = TreeNode(7) >>> root.left, root.right = tree_node2, tree_node3 >>> tree_node2.left, tree_node2.right = tree_node4 , tree_node5 >>> tree_node3.left, tree_node3.right = tree_node6 , tree_node7 >>> pre_order(root) 1,2,4,5,3,6,7, """ if not isinstance(node, TreeNode) or not node: return print(node.data, end=",") pre_order(node.left) pre_order(node.right) def in_order(node: TreeNode) -> None: """ >>> root = TreeNode(1) >>> tree_node2 = TreeNode(2) >>> tree_node3 = TreeNode(3) >>> tree_node4 = TreeNode(4) >>> tree_node5 = TreeNode(5) >>> tree_node6 = TreeNode(6) >>> tree_node7 = TreeNode(7) >>> root.left, root.right = tree_node2, tree_node3 >>> tree_node2.left, tree_node2.right = tree_node4 , tree_node5 >>> tree_node3.left, tree_node3.right = tree_node6 , tree_node7 >>> in_order(root) 4,2,5,1,6,3,7, """ if not isinstance(node, TreeNode) or not node: return in_order(node.left) print(node.data, end=",") in_order(node.right) def post_order(node: TreeNode) -> None: """ >>> root = TreeNode(1) >>> tree_node2 = TreeNode(2) >>> tree_node3 = TreeNode(3) >>> tree_node4 = TreeNode(4) >>> tree_node5 = TreeNode(5) >>> tree_node6 = TreeNode(6) >>> tree_node7 = TreeNode(7) >>> root.left, root.right = tree_node2, tree_node3 >>> tree_node2.left, tree_node2.right = tree_node4 , tree_node5 >>> tree_node3.left, tree_node3.right = tree_node6 , tree_node7 >>> post_order(root) 4,5,2,6,7,3,1, """ if not isinstance(node, TreeNode) or not node: return post_order(node.left) post_order(node.right) print(node.data, end=",") def level_order(node: TreeNode) -> None: """ >>> root = TreeNode(1) >>> tree_node2 = TreeNode(2) >>> tree_node3 = TreeNode(3) >>> tree_node4 = TreeNode(4) >>> tree_node5 = TreeNode(5) >>> tree_node6 = TreeNode(6) >>> tree_node7 = TreeNode(7) >>> root.left, root.right = tree_node2, tree_node3 >>> tree_node2.left, tree_node2.right = tree_node4 , tree_node5 >>> tree_node3.left, tree_node3.right = tree_node6 , tree_node7 >>> level_order(root) 1,2,3,4,5,6,7, """ if not isinstance(node, TreeNode) or not node: return q: queue.Queue = queue.Queue() q.put(node) while not q.empty(): node_dequeued = q.get() print(node_dequeued.data, end=",") if node_dequeued.left: q.put(node_dequeued.left) if node_dequeued.right: q.put(node_dequeued.right) def level_order_actual(node: TreeNode) -> None: """ >>> root = TreeNode(1) >>> tree_node2 = 
TreeNode(2) >>> tree_node3 = TreeNode(3) >>> tree_node4 = TreeNode(4) >>> tree_node5 = TreeNode(5) >>> tree_node6 = TreeNode(6) >>> tree_node7 = TreeNode(7) >>> root.left, root.right = tree_node2, tree_node3 >>> tree_node2.left, tree_node2.right = tree_node4 , tree_node5 >>> tree_node3.left, tree_node3.right = tree_node6 , tree_node7 >>> level_order_actual(root) 1, 2,3, 4,5,6,7, """ if not isinstance(node, TreeNode) or not node: return q: queue.Queue = queue.Queue() q.put(node) while not q.empty(): list = [] while not q.empty(): node_dequeued = q.get() print(node_dequeued.data, end=",") if node_dequeued.left: list.append(node_dequeued.left) if node_dequeued.right: list.append(node_dequeued.right) print() for node in list: q.put(node) # iteration version def pre_order_iter(node: TreeNode) -> None: """ >>> root = TreeNode(1) >>> tree_node2 = TreeNode(2) >>> tree_node3 = TreeNode(3) >>> tree_node4 = TreeNode(4) >>> tree_node5 = TreeNode(5) >>> tree_node6 = TreeNode(6) >>> tree_node7 = TreeNode(7) >>> root.left, root.right = tree_node2, tree_node3 >>> tree_node2.left, tree_node2.right = tree_node4 , tree_node5 >>> tree_node3.left, tree_node3.right = tree_node6 , tree_node7 >>> pre_order_iter(root) 1,2,4,5,3,6,7, """ if not isinstance(node, TreeNode) or not node: return stack: list[TreeNode] = [] n = node while n or stack: while n: # start from root node, find its left child print(n.data, end=",") stack.append(n) n = n.left # end of while means current node doesn't have left child n = stack.pop() # start to traverse its right child n = n.right def in_order_iter(node: TreeNode) -> None: """ >>> root = TreeNode(1) >>> tree_node2 = TreeNode(2) >>> tree_node3 = TreeNode(3) >>> tree_node4 = TreeNode(4) >>> tree_node5 = TreeNode(5) >>> tree_node6 = TreeNode(6) >>> tree_node7 = TreeNode(7) >>> root.left, root.right = tree_node2, tree_node3 >>> tree_node2.left, tree_node2.right = tree_node4 , tree_node5 >>> tree_node3.left, tree_node3.right = tree_node6 , tree_node7 >>> in_order_iter(root) 4,2,5,1,6,3,7, """ if not isinstance(node, TreeNode) or not node: return stack: list[TreeNode] = [] n = node while n or stack: while n: stack.append(n) n = n.left n = stack.pop() print(n.data, end=",") n = n.right def post_order_iter(node: TreeNode) -> None: """ >>> root = TreeNode(1) >>> tree_node2 = TreeNode(2) >>> tree_node3 = TreeNode(3) >>> tree_node4 = TreeNode(4) >>> tree_node5 = TreeNode(5) >>> tree_node6 = TreeNode(6) >>> tree_node7 = TreeNode(7) >>> root.left, root.right = tree_node2, tree_node3 >>> tree_node2.left, tree_node2.right = tree_node4 , tree_node5 >>> tree_node3.left, tree_node3.right = tree_node6 , tree_node7 >>> post_order_iter(root) 4,5,2,6,7,3,1, """ if not isinstance(node, TreeNode) or not node: return stack1, stack2 = [], [] n = node stack1.append(n) while stack1: # to find the reversed order of post order, store it in stack2 n = stack1.pop() if n.left: stack1.append(n.left) if n.right: stack1.append(n.right) stack2.append(n) while stack2: # pop up from stack2 will be the post order print(stack2.pop().data, end=",") def prompt(s: str = "", width=50, char="*") -> str: if not s: return "\n" + width * char left, extra = divmod(width - len(s) - 2, 2) return f"{left * char} {s} {(left + extra) * char}" if __name__ == "__main__": import doctest doctest.testmod() print(prompt("Binary Tree Traversals")) node = build_tree() print(prompt("Pre Order Traversal")) pre_order(node) print(prompt() + "\n") print(prompt("In Order Traversal")) in_order(node) print(prompt() + "\n") print(prompt("Post Order 
Traversal")) post_order(node) print(prompt() + "\n") print(prompt("Level Order Traversal")) level_order(node) print(prompt() + "\n") print(prompt("Actual Level Order Traversal")) level_order_actual(node) print("*" * 50 + "\n") print(prompt("Pre Order Traversal - Iteration Version")) pre_order_iter(node) print(prompt() + "\n") print(prompt("In Order Traversal - Iteration Version")) in_order_iter(node) print(prompt() + "\n") print(prompt("Post Order Traversal - Iteration Version")) post_order_iter(node) print(prompt())
# flake8: noqa """ This is pure Python implementation of tree traversal algorithms """ from __future__ import annotations import queue class TreeNode: def __init__(self, data): self.data = data self.right = None self.left = None def build_tree(): print("\n********Press N to stop entering at any point of time********\n") check = input("Enter the value of the root node: ").strip().lower() or "n" if check == "n": return None q: queue.Queue = queue.Queue() tree_node = TreeNode(int(check)) q.put(tree_node) while not q.empty(): node_found = q.get() msg = "Enter the left node of %s: " % node_found.data check = input(msg).strip().lower() or "n" if check == "n": return tree_node left_node = TreeNode(int(check)) node_found.left = left_node q.put(left_node) msg = "Enter the right node of %s: " % node_found.data check = input(msg).strip().lower() or "n" if check == "n": return tree_node right_node = TreeNode(int(check)) node_found.right = right_node q.put(right_node) def pre_order(node: TreeNode) -> None: """ >>> root = TreeNode(1) >>> tree_node2 = TreeNode(2) >>> tree_node3 = TreeNode(3) >>> tree_node4 = TreeNode(4) >>> tree_node5 = TreeNode(5) >>> tree_node6 = TreeNode(6) >>> tree_node7 = TreeNode(7) >>> root.left, root.right = tree_node2, tree_node3 >>> tree_node2.left, tree_node2.right = tree_node4 , tree_node5 >>> tree_node3.left, tree_node3.right = tree_node6 , tree_node7 >>> pre_order(root) 1,2,4,5,3,6,7, """ if not isinstance(node, TreeNode) or not node: return print(node.data, end=",") pre_order(node.left) pre_order(node.right) def in_order(node: TreeNode) -> None: """ >>> root = TreeNode(1) >>> tree_node2 = TreeNode(2) >>> tree_node3 = TreeNode(3) >>> tree_node4 = TreeNode(4) >>> tree_node5 = TreeNode(5) >>> tree_node6 = TreeNode(6) >>> tree_node7 = TreeNode(7) >>> root.left, root.right = tree_node2, tree_node3 >>> tree_node2.left, tree_node2.right = tree_node4 , tree_node5 >>> tree_node3.left, tree_node3.right = tree_node6 , tree_node7 >>> in_order(root) 4,2,5,1,6,3,7, """ if not isinstance(node, TreeNode) or not node: return in_order(node.left) print(node.data, end=",") in_order(node.right) def post_order(node: TreeNode) -> None: """ >>> root = TreeNode(1) >>> tree_node2 = TreeNode(2) >>> tree_node3 = TreeNode(3) >>> tree_node4 = TreeNode(4) >>> tree_node5 = TreeNode(5) >>> tree_node6 = TreeNode(6) >>> tree_node7 = TreeNode(7) >>> root.left, root.right = tree_node2, tree_node3 >>> tree_node2.left, tree_node2.right = tree_node4 , tree_node5 >>> tree_node3.left, tree_node3.right = tree_node6 , tree_node7 >>> post_order(root) 4,5,2,6,7,3,1, """ if not isinstance(node, TreeNode) or not node: return post_order(node.left) post_order(node.right) print(node.data, end=",") def level_order(node: TreeNode) -> None: """ >>> root = TreeNode(1) >>> tree_node2 = TreeNode(2) >>> tree_node3 = TreeNode(3) >>> tree_node4 = TreeNode(4) >>> tree_node5 = TreeNode(5) >>> tree_node6 = TreeNode(6) >>> tree_node7 = TreeNode(7) >>> root.left, root.right = tree_node2, tree_node3 >>> tree_node2.left, tree_node2.right = tree_node4 , tree_node5 >>> tree_node3.left, tree_node3.right = tree_node6 , tree_node7 >>> level_order(root) 1,2,3,4,5,6,7, """ if not isinstance(node, TreeNode) or not node: return q: queue.Queue = queue.Queue() q.put(node) while not q.empty(): node_dequeued = q.get() print(node_dequeued.data, end=",") if node_dequeued.left: q.put(node_dequeued.left) if node_dequeued.right: q.put(node_dequeued.right) def level_order_actual(node: TreeNode) -> None: """ >>> root = TreeNode(1) >>> tree_node2 = 
TreeNode(2) >>> tree_node3 = TreeNode(3) >>> tree_node4 = TreeNode(4) >>> tree_node5 = TreeNode(5) >>> tree_node6 = TreeNode(6) >>> tree_node7 = TreeNode(7) >>> root.left, root.right = tree_node2, tree_node3 >>> tree_node2.left, tree_node2.right = tree_node4 , tree_node5 >>> tree_node3.left, tree_node3.right = tree_node6 , tree_node7 >>> level_order_actual(root) 1, 2,3, 4,5,6,7, """ if not isinstance(node, TreeNode) or not node: return q: queue.Queue = queue.Queue() q.put(node) while not q.empty(): list = [] while not q.empty(): node_dequeued = q.get() print(node_dequeued.data, end=",") if node_dequeued.left: list.append(node_dequeued.left) if node_dequeued.right: list.append(node_dequeued.right) print() for node in list: q.put(node) # iteration version def pre_order_iter(node: TreeNode) -> None: """ >>> root = TreeNode(1) >>> tree_node2 = TreeNode(2) >>> tree_node3 = TreeNode(3) >>> tree_node4 = TreeNode(4) >>> tree_node5 = TreeNode(5) >>> tree_node6 = TreeNode(6) >>> tree_node7 = TreeNode(7) >>> root.left, root.right = tree_node2, tree_node3 >>> tree_node2.left, tree_node2.right = tree_node4 , tree_node5 >>> tree_node3.left, tree_node3.right = tree_node6 , tree_node7 >>> pre_order_iter(root) 1,2,4,5,3,6,7, """ if not isinstance(node, TreeNode) or not node: return stack: list[TreeNode] = [] n = node while n or stack: while n: # start from root node, find its left child print(n.data, end=",") stack.append(n) n = n.left # end of while means current node doesn't have left child n = stack.pop() # start to traverse its right child n = n.right def in_order_iter(node: TreeNode) -> None: """ >>> root = TreeNode(1) >>> tree_node2 = TreeNode(2) >>> tree_node3 = TreeNode(3) >>> tree_node4 = TreeNode(4) >>> tree_node5 = TreeNode(5) >>> tree_node6 = TreeNode(6) >>> tree_node7 = TreeNode(7) >>> root.left, root.right = tree_node2, tree_node3 >>> tree_node2.left, tree_node2.right = tree_node4 , tree_node5 >>> tree_node3.left, tree_node3.right = tree_node6 , tree_node7 >>> in_order_iter(root) 4,2,5,1,6,3,7, """ if not isinstance(node, TreeNode) or not node: return stack: list[TreeNode] = [] n = node while n or stack: while n: stack.append(n) n = n.left n = stack.pop() print(n.data, end=",") n = n.right def post_order_iter(node: TreeNode) -> None: """ >>> root = TreeNode(1) >>> tree_node2 = TreeNode(2) >>> tree_node3 = TreeNode(3) >>> tree_node4 = TreeNode(4) >>> tree_node5 = TreeNode(5) >>> tree_node6 = TreeNode(6) >>> tree_node7 = TreeNode(7) >>> root.left, root.right = tree_node2, tree_node3 >>> tree_node2.left, tree_node2.right = tree_node4 , tree_node5 >>> tree_node3.left, tree_node3.right = tree_node6 , tree_node7 >>> post_order_iter(root) 4,5,2,6,7,3,1, """ if not isinstance(node, TreeNode) or not node: return stack1, stack2 = [], [] n = node stack1.append(n) while stack1: # to find the reversed order of post order, store it in stack2 n = stack1.pop() if n.left: stack1.append(n.left) if n.right: stack1.append(n.right) stack2.append(n) while stack2: # pop up from stack2 will be the post order print(stack2.pop().data, end=",") def prompt(s: str = "", width=50, char="*") -> str: if not s: return "\n" + width * char left, extra = divmod(width - len(s) - 2, 2) return f"{left * char} {s} {(left + extra) * char}" if __name__ == "__main__": import doctest doctest.testmod() print(prompt("Binary Tree Traversals")) node = build_tree() print(prompt("Pre Order Traversal")) pre_order(node) print(prompt() + "\n") print(prompt("In Order Traversal")) in_order(node) print(prompt() + "\n") print(prompt("Post Order 
Traversal")) post_order(node) print(prompt() + "\n") print(prompt("Level Order Traversal")) level_order(node) print(prompt() + "\n") print(prompt("Actual Level Order Traversal")) level_order_actual(node) print("*" * 50 + "\n") print(prompt("Pre Order Traversal - Iteration Version")) pre_order_iter(node) print(prompt() + "\n") print(prompt("In Order Traversal - Iteration Version")) in_order_iter(node) print(prompt() + "\n") print(prompt("Post Order Traversal - Iteration Version")) post_order_iter(node) print(prompt())
-1
TheAlgorithms/Python
5,638
Add pyupgrade to pre-commit
### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2021-10-28T14:37:17Z"
"2021-10-28T14:45:59Z"
70368a757e8d37b9f3dd96af4ca535275cb39580
477cc3fe597fd931c742700284016b937c778fe1
Add pyupgrade to pre-commit. ### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
from PIL import Image """ Mean thresholding algorithm for image processing https://en.wikipedia.org/wiki/Thresholding_(image_processing) """ def mean_threshold(image: Image) -> Image: """ image: is a grayscale PIL image object """ height, width = image.size mean = 0 pixels = image.load() for i in range(width): for j in range(height): pixel = pixels[j, i] mean += pixel mean //= width * height for j in range(width): for i in range(height): pixels[i, j] = 255 if pixels[i, j] > mean else 0 return image if __name__ == "__main__": image = mean_threshold(Image.open("path_to_image").convert("L")) image.save("output_image_path")
from PIL import Image """ Mean thresholding algorithm for image processing https://en.wikipedia.org/wiki/Thresholding_(image_processing) """ def mean_threshold(image: Image) -> Image: """ image: is a grayscale PIL image object """ height, width = image.size mean = 0 pixels = image.load() for i in range(width): for j in range(height): pixel = pixels[j, i] mean += pixel mean //= width * height for j in range(width): for i in range(height): pixels[i, j] = 255 if pixels[i, j] > mean else 0 return image if __name__ == "__main__": image = mean_threshold(Image.open("path_to_image").convert("L")) image.save("output_image_path")
-1
TheAlgorithms/Python
5,638
Add pyupgrade to pre-commit
### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2021-10-28T14:37:17Z"
"2021-10-28T14:45:59Z"
70368a757e8d37b9f3dd96af4ca535275cb39580
477cc3fe597fd931c742700284016b937c778fe1
Add pyupgrade to pre-commit. ### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
# Youtube Explanation: https://www.youtube.com/watch?v=lBRtnuxg-gU from __future__ import annotations def minimum_cost_path(matrix: list[list[int]]) -> int: """ Find the minimum cost traced by all possible paths from top left to bottom right in a given matrix >>> minimum_cost_path([[2, 1], [3, 1], [4, 2]]) 6 >>> minimum_cost_path([[2, 1, 4], [2, 1, 3], [3, 2, 1]]) 7 """ # preprocessing the first row for i in range(1, len(matrix[0])): matrix[0][i] += matrix[0][i - 1] # preprocessing the first column for i in range(1, len(matrix)): matrix[i][0] += matrix[i - 1][0] # updating the path cost for current position for i in range(1, len(matrix)): for j in range(1, len(matrix[0])): matrix[i][j] += min(matrix[i - 1][j], matrix[i][j - 1]) return matrix[-1][-1] if __name__ == "__main__": import doctest doctest.testmod()
# Youtube Explanation: https://www.youtube.com/watch?v=lBRtnuxg-gU from __future__ import annotations def minimum_cost_path(matrix: list[list[int]]) -> int: """ Find the minimum cost traced by all possible paths from top left to bottom right in a given matrix >>> minimum_cost_path([[2, 1], [3, 1], [4, 2]]) 6 >>> minimum_cost_path([[2, 1, 4], [2, 1, 3], [3, 2, 1]]) 7 """ # preprocessing the first row for i in range(1, len(matrix[0])): matrix[0][i] += matrix[0][i - 1] # preprocessing the first column for i in range(1, len(matrix)): matrix[i][0] += matrix[i - 1][0] # updating the path cost for current position for i in range(1, len(matrix)): for j in range(1, len(matrix[0])): matrix[i][j] += min(matrix[i - 1][j], matrix[i][j - 1]) return matrix[-1][-1] if __name__ == "__main__": import doctest doctest.testmod()
-1
TheAlgorithms/Python
5,638
Add pyupgrade to pre-commit
### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2021-10-28T14:37:17Z"
"2021-10-28T14:45:59Z"
70368a757e8d37b9f3dd96af4ca535275cb39580
477cc3fe597fd931c742700284016b937c778fe1
Add pyupgrade to pre-commit. ### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
from __future__ import annotations from collections import deque class Automaton: def __init__(self, keywords: list[str]): self.adlist: list[dict] = list() self.adlist.append( {"value": "", "next_states": [], "fail_state": 0, "output": []} ) for keyword in keywords: self.add_keyword(keyword) self.set_fail_transitions() def find_next_state(self, current_state: int, char: str) -> int | None: for state in self.adlist[current_state]["next_states"]: if char == self.adlist[state]["value"]: return state return None def add_keyword(self, keyword: str) -> None: current_state = 0 for character in keyword: next_state = self.find_next_state(current_state, character) if next_state is None: self.adlist.append( { "value": character, "next_states": [], "fail_state": 0, "output": [], } ) self.adlist[current_state]["next_states"].append(len(self.adlist) - 1) current_state = len(self.adlist) - 1 else: current_state = next_state self.adlist[current_state]["output"].append(keyword) def set_fail_transitions(self) -> None: q: deque = deque() for node in self.adlist[0]["next_states"]: q.append(node) self.adlist[node]["fail_state"] = 0 while q: r = q.popleft() for child in self.adlist[r]["next_states"]: q.append(child) state = self.adlist[r]["fail_state"] while ( self.find_next_state(state, self.adlist[child]["value"]) is None and state != 0 ): state = self.adlist[state]["fail_state"] self.adlist[child]["fail_state"] = self.find_next_state( state, self.adlist[child]["value"] ) if self.adlist[child]["fail_state"] is None: self.adlist[child]["fail_state"] = 0 self.adlist[child]["output"] = ( self.adlist[child]["output"] + self.adlist[self.adlist[child]["fail_state"]]["output"] ) def search_in(self, string: str) -> dict[str, list[int]]: """ >>> A = Automaton(["what", "hat", "ver", "er"]) >>> A.search_in("whatever, err ... , wherever") {'what': [0], 'hat': [1], 'ver': [5, 25], 'er': [6, 10, 22, 26]} """ result: dict = ( dict() ) # returns a dict with keywords and list of its occurrences current_state = 0 for i in range(len(string)): while ( self.find_next_state(current_state, string[i]) is None and current_state != 0 ): current_state = self.adlist[current_state]["fail_state"] next_state = self.find_next_state(current_state, string[i]) if next_state is None: current_state = 0 else: current_state = next_state for key in self.adlist[current_state]["output"]: if not (key in result): result[key] = [] result[key].append(i - len(key) + 1) return result if __name__ == "__main__": import doctest doctest.testmod()
from __future__ import annotations from collections import deque class Automaton: def __init__(self, keywords: list[str]): self.adlist: list[dict] = list() self.adlist.append( {"value": "", "next_states": [], "fail_state": 0, "output": []} ) for keyword in keywords: self.add_keyword(keyword) self.set_fail_transitions() def find_next_state(self, current_state: int, char: str) -> int | None: for state in self.adlist[current_state]["next_states"]: if char == self.adlist[state]["value"]: return state return None def add_keyword(self, keyword: str) -> None: current_state = 0 for character in keyword: next_state = self.find_next_state(current_state, character) if next_state is None: self.adlist.append( { "value": character, "next_states": [], "fail_state": 0, "output": [], } ) self.adlist[current_state]["next_states"].append(len(self.adlist) - 1) current_state = len(self.adlist) - 1 else: current_state = next_state self.adlist[current_state]["output"].append(keyword) def set_fail_transitions(self) -> None: q: deque = deque() for node in self.adlist[0]["next_states"]: q.append(node) self.adlist[node]["fail_state"] = 0 while q: r = q.popleft() for child in self.adlist[r]["next_states"]: q.append(child) state = self.adlist[r]["fail_state"] while ( self.find_next_state(state, self.adlist[child]["value"]) is None and state != 0 ): state = self.adlist[state]["fail_state"] self.adlist[child]["fail_state"] = self.find_next_state( state, self.adlist[child]["value"] ) if self.adlist[child]["fail_state"] is None: self.adlist[child]["fail_state"] = 0 self.adlist[child]["output"] = ( self.adlist[child]["output"] + self.adlist[self.adlist[child]["fail_state"]]["output"] ) def search_in(self, string: str) -> dict[str, list[int]]: """ >>> A = Automaton(["what", "hat", "ver", "er"]) >>> A.search_in("whatever, err ... , wherever") {'what': [0], 'hat': [1], 'ver': [5, 25], 'er': [6, 10, 22, 26]} """ result: dict = ( dict() ) # returns a dict with keywords and list of its occurrences current_state = 0 for i in range(len(string)): while ( self.find_next_state(current_state, string[i]) is None and current_state != 0 ): current_state = self.adlist[current_state]["fail_state"] next_state = self.find_next_state(current_state, string[i]) if next_state is None: current_state = 0 else: current_state = next_state for key in self.adlist[current_state]["output"]: if not (key in result): result[key] = [] result[key].append(i - len(key) + 1) return result if __name__ == "__main__": import doctest doctest.testmod()
-1
TheAlgorithms/Python
5,638
Add pyupgrade to pre-commit
### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2021-10-28T14:37:17Z"
"2021-10-28T14:45:59Z"
70368a757e8d37b9f3dd96af4ca535275cb39580
477cc3fe597fd931c742700284016b937c778fe1
Add pyupgrade to pre-commit. ### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Implementation of a basic regression decision tree. Input data set: The input data set must be 1-dimensional with continuous labels. Output: The decision tree maps a real number input to a real number output. """ import numpy as np class Decision_Tree: def __init__(self, depth=5, min_leaf_size=5): self.depth = depth self.decision_boundary = 0 self.left = None self.right = None self.min_leaf_size = min_leaf_size self.prediction = None def mean_squared_error(self, labels, prediction): """ mean_squared_error: @param labels: a one dimensional numpy array @param prediction: a floating point value return value: mean_squared_error calculates the error if prediction is used to estimate the labels >>> tester = Decision_Tree() >>> test_labels = np.array([1,2,3,4,5,6,7,8,9,10]) >>> test_prediction = np.float(6) >>> tester.mean_squared_error(test_labels, test_prediction) == ( ... Test_Decision_Tree.helper_mean_squared_error_test(test_labels, ... test_prediction)) True >>> test_labels = np.array([1,2,3]) >>> test_prediction = np.float(2) >>> tester.mean_squared_error(test_labels, test_prediction) == ( ... Test_Decision_Tree.helper_mean_squared_error_test(test_labels, ... test_prediction)) True """ if labels.ndim != 1: print("Error: Input labels must be one dimensional") return np.mean((labels - prediction) ** 2) def train(self, X, y): """ train: @param X: a one dimensional numpy array @param y: a one dimensional numpy array. The contents of y are the labels for the corresponding X values train does not have a return value """ """ this section is to check that the inputs conform to our dimensionality constraints """ if X.ndim != 1: print("Error: Input data set must be one dimensional") return if len(X) != len(y): print("Error: X and y have different lengths") return if y.ndim != 1: print("Error: Data set labels must be one dimensional") return if len(X) < 2 * self.min_leaf_size: self.prediction = np.mean(y) return if self.depth == 1: self.prediction = np.mean(y) return best_split = 0 min_error = self.mean_squared_error(X, np.mean(y)) * 2 """ loop over all possible splits for the decision tree. find the best split. 
if no split exists that is less than 2 * error for the entire array then the data set is not split and the average for the entire array is used as the predictor """ for i in range(len(X)): if len(X[:i]) < self.min_leaf_size: continue elif len(X[i:]) < self.min_leaf_size: continue else: error_left = self.mean_squared_error(X[:i], np.mean(y[:i])) error_right = self.mean_squared_error(X[i:], np.mean(y[i:])) error = error_left + error_right if error < min_error: best_split = i min_error = error if best_split != 0: left_X = X[:best_split] left_y = y[:best_split] right_X = X[best_split:] right_y = y[best_split:] self.decision_boundary = X[best_split] self.left = Decision_Tree( depth=self.depth - 1, min_leaf_size=self.min_leaf_size ) self.right = Decision_Tree( depth=self.depth - 1, min_leaf_size=self.min_leaf_size ) self.left.train(left_X, left_y) self.right.train(right_X, right_y) else: self.prediction = np.mean(y) return def predict(self, x): """ predict: @param x: a floating point value to predict the label of the prediction function works by recursively calling the predict function of the appropriate subtrees based on the tree's decision boundary """ if self.prediction is not None: return self.prediction elif self.left or self.right is not None: if x >= self.decision_boundary: return self.right.predict(x) else: return self.left.predict(x) else: print("Error: Decision tree not yet trained") return None class Test_Decision_Tree: """Decision Tres test class""" @staticmethod def helper_mean_squared_error_test(labels, prediction): """ helper_mean_squared_error_test: @param labels: a one dimensional numpy array @param prediction: a floating point value return value: helper_mean_squared_error_test calculates the mean squared error """ squared_error_sum = np.float(0) for label in labels: squared_error_sum += (label - prediction) ** 2 return np.float(squared_error_sum / labels.size) def main(): """ In this demonstration we're generating a sample data set from the sin function in numpy. We then train a decision tree on the data set and use the decision tree to predict the label of 10 different test values. Then the mean squared error over this test is displayed. """ X = np.arange(-1.0, 1.0, 0.005) y = np.sin(X) tree = Decision_Tree(depth=10, min_leaf_size=10) tree.train(X, y) test_cases = (np.random.rand(10) * 2) - 1 predictions = np.array([tree.predict(x) for x in test_cases]) avg_error = np.mean((predictions - test_cases) ** 2) print("Test values: " + str(test_cases)) print("Predictions: " + str(predictions)) print("Average error: " + str(avg_error)) if __name__ == "__main__": main() import doctest doctest.testmod(name="mean_squarred_error", verbose=True)
""" Implementation of a basic regression decision tree. Input data set: The input data set must be 1-dimensional with continuous labels. Output: The decision tree maps a real number input to a real number output. """ import numpy as np class Decision_Tree: def __init__(self, depth=5, min_leaf_size=5): self.depth = depth self.decision_boundary = 0 self.left = None self.right = None self.min_leaf_size = min_leaf_size self.prediction = None def mean_squared_error(self, labels, prediction): """ mean_squared_error: @param labels: a one dimensional numpy array @param prediction: a floating point value return value: mean_squared_error calculates the error if prediction is used to estimate the labels >>> tester = Decision_Tree() >>> test_labels = np.array([1,2,3,4,5,6,7,8,9,10]) >>> test_prediction = np.float(6) >>> tester.mean_squared_error(test_labels, test_prediction) == ( ... Test_Decision_Tree.helper_mean_squared_error_test(test_labels, ... test_prediction)) True >>> test_labels = np.array([1,2,3]) >>> test_prediction = np.float(2) >>> tester.mean_squared_error(test_labels, test_prediction) == ( ... Test_Decision_Tree.helper_mean_squared_error_test(test_labels, ... test_prediction)) True """ if labels.ndim != 1: print("Error: Input labels must be one dimensional") return np.mean((labels - prediction) ** 2) def train(self, X, y): """ train: @param X: a one dimensional numpy array @param y: a one dimensional numpy array. The contents of y are the labels for the corresponding X values train does not have a return value """ """ this section is to check that the inputs conform to our dimensionality constraints """ if X.ndim != 1: print("Error: Input data set must be one dimensional") return if len(X) != len(y): print("Error: X and y have different lengths") return if y.ndim != 1: print("Error: Data set labels must be one dimensional") return if len(X) < 2 * self.min_leaf_size: self.prediction = np.mean(y) return if self.depth == 1: self.prediction = np.mean(y) return best_split = 0 min_error = self.mean_squared_error(X, np.mean(y)) * 2 """ loop over all possible splits for the decision tree. find the best split. 
if no split exists that is less than 2 * error for the entire array then the data set is not split and the average for the entire array is used as the predictor """ for i in range(len(X)): if len(X[:i]) < self.min_leaf_size: continue elif len(X[i:]) < self.min_leaf_size: continue else: error_left = self.mean_squared_error(X[:i], np.mean(y[:i])) error_right = self.mean_squared_error(X[i:], np.mean(y[i:])) error = error_left + error_right if error < min_error: best_split = i min_error = error if best_split != 0: left_X = X[:best_split] left_y = y[:best_split] right_X = X[best_split:] right_y = y[best_split:] self.decision_boundary = X[best_split] self.left = Decision_Tree( depth=self.depth - 1, min_leaf_size=self.min_leaf_size ) self.right = Decision_Tree( depth=self.depth - 1, min_leaf_size=self.min_leaf_size ) self.left.train(left_X, left_y) self.right.train(right_X, right_y) else: self.prediction = np.mean(y) return def predict(self, x): """ predict: @param x: a floating point value to predict the label of the prediction function works by recursively calling the predict function of the appropriate subtrees based on the tree's decision boundary """ if self.prediction is not None: return self.prediction elif self.left or self.right is not None: if x >= self.decision_boundary: return self.right.predict(x) else: return self.left.predict(x) else: print("Error: Decision tree not yet trained") return None class Test_Decision_Tree: """Decision Tres test class""" @staticmethod def helper_mean_squared_error_test(labels, prediction): """ helper_mean_squared_error_test: @param labels: a one dimensional numpy array @param prediction: a floating point value return value: helper_mean_squared_error_test calculates the mean squared error """ squared_error_sum = np.float(0) for label in labels: squared_error_sum += (label - prediction) ** 2 return np.float(squared_error_sum / labels.size) def main(): """ In this demonstration we're generating a sample data set from the sin function in numpy. We then train a decision tree on the data set and use the decision tree to predict the label of 10 different test values. Then the mean squared error over this test is displayed. """ X = np.arange(-1.0, 1.0, 0.005) y = np.sin(X) tree = Decision_Tree(depth=10, min_leaf_size=10) tree.train(X, y) test_cases = (np.random.rand(10) * 2) - 1 predictions = np.array([tree.predict(x) for x in test_cases]) avg_error = np.mean((predictions - test_cases) ** 2) print("Test values: " + str(test_cases)) print("Predictions: " + str(predictions)) print("Average error: " + str(avg_error)) if __name__ == "__main__": main() import doctest doctest.testmod(name="mean_squarred_error", verbose=True)
-1
TheAlgorithms/Python
5,638
Add pyupgrade to pre-commit
### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2021-10-28T14:37:17Z"
"2021-10-28T14:45:59Z"
70368a757e8d37b9f3dd96af4ca535275cb39580
477cc3fe597fd931c742700284016b937c778fe1
Add pyupgrade to pre-commit. ### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
[binary JPEG image data omitted — JFIF/Exif content (XMP CreateDate 2019-07-22) cannot be represented as text]
JFIF``ExifMM*;HasiJ >2424 2019:07:22 20:14:152019:07:22 20:14:15Has http://ns.adobe.com/xap/1.0/<?xpacket begin='' id='W5M0MpCehiHzreSzNTczkc9d'?> <x:xmpmeta xmlns:x="adobe:ns:meta/"><rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"><rdf:Description rdf:about="uuid:faf5bdd5-ba3d-11da-ad31-d33d75182f1b" xmlns:dc="http://purl.org/dc/elements/1.1/"/><rdf:Description rdf:about="uuid:faf5bdd5-ba3d-11da-ad31-d33d75182f1b" xmlns:xmp="http://ns.adobe.com/xap/1.0/"><xmp:CreateDate>2019-07-22T20:14:15.236</xmp:CreateDate></rdf:Description><rdf:Description rdf:about="uuid:faf5bdd5-ba3d-11da-ad31-d33d75182f1b" xmlns:dc="http://purl.org/dc/elements/1.1/"><dc:creator><rdf:Seq xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"><rdf:li>Has</rdf:li></rdf:Seq> </dc:creator></rdf:Description></rdf:RDF></x:xmpmeta> <?xpacket end='w'?>C   '!%"."%()+,+ /3/*2'*+*C  ***************************************************e" }!1AQa"q2#BR$3br %&'()*456789:CDEFGHIJSTUVWXYZcdefghijstuvwxyz w!1AQaq"2B #3Rbr $4%&'()*56789:CDEFGHIJSTUVWXYZcdefghijstuvwxyz ?F(((((((((((((((((((((((((((((((((((( Vk8w}=Y[Plgl17P=T_/\& iR7ԫhzzSys 72,PąFrIG'<Aً+,cy.*;j( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( (25p7񎭧hzlڵkٺAeI՝MzF1W|TM-{-.RE+iݟZ_=~↱ҠK:2_yNup2Wddqֲ5-_ßm,W,d˨o\al1f5VEhf!Xfq uey皲EQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEVFVnz2裟ǥk]AgA߾Jڄ#Ra-H9S)t: Z:hG  WgҴ OMt$э]S3VYEl Ҍ!K$4ns=kG<EwxsRd[ѣ¡'5|kSbuv->/.7-6zVFǫ+}Ӛ\OP9o.e\F =vnO|Jp"fx<X<>jzuj}[&FqA uN#wQ(޼ӭq^5~{*P<%vFI9 UռUg#Oހ-A zWOWӣg "mOL]`uMEFc<zJp7ޣ-2G?pp3QgvzN>+eW+cPTP̪2{ő~!t4596@~uь3g?INxV5xrMJc y@[É9(}֖s,x;ђ1$ r)79\Fѥ?u۸(y"ڂj=2%):FyC ㏭i䊩jvzN׷ {t`FI~PE&gܩ, \rjQjir"iV1hHh\r:wI9s[O&p\S]Or9#ր% 4ns=ꎷ.^j- _eb|ώހ40NڝXՓ]Э58mweh#T9^:0$FGF 8b0=)sy4Îzp3Vh⍦}@EP:嬾mnOY\#F9B,SFX֣ bq˘ V884.E! u5pby?ҍ֢~.{9-'Ś>}ͩ06C0# FNx!_Vrޞ `q֍#9 dojYb*bOA@w <ѸgD3ƫ>fш>7nսoX4dH\#ӝ`NB@ 2zU;٧ݮZ%ްE ^ygfHRxm5_(i[_qwCОCہF犠ӖMP֭{PZ\'Sf4-qQEQEdkzrp:i<wz_ҸMsi*>k;{@M^) 4=,󶗪R|ɞ$_Q]]ұ5kt)<sU@q⬃g'LR=M#險y:?yOIWAP@9PEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEP\w5߁t+K'IJ zcEiJ~΢gs*7%Ko[ [[pH-<e،][c'L>^? 
*MϾgMCz =c5σiiyXpG&,z>)GX|ߔ qxU.tD _lmr~l 7JCAaK6A;xljg=OAXA &yjmk|E_ݾ&i/ׁY CĖ=jS<z.1@4xO>65)8[rJP9MgUm#ǺE_ͨ{ w#StoKoxWYVOVۄ_D,ͦY2;?-sbYmkP]>mu6:J*0yϧjMWV:Mnfx\a#JÚ6z}s[Og2`m]],eI崓ȵxڇ<%sE>ZJ1qJ^O=xedapP܋w8t6:-4kg6{,Z]whzwQ6|D 3i;|¿iXWm+k$𗍴Xu]J7ZwXWw˓UVoǠ^k7FTdJq ދzj"mGnn=ϴoNh?$7#q6xT~}uɨ:2Fgr9@^!໷Qmv#;H9֮_[Y^#}2j [83n.qjiڷD+ksb]뙢Ѯ%f H3󫟆~%=~9D4An{fSE[];n+@?6m&PRH.F{a8r?4DSh|VdLoeݎ$ >h1i2а ׽i^w^1Ӽ@|EkXbê]s@:5T'+·W,vuezWEW մۨOpm.!hx gp`'ޯxzhzڭ*Ҽ^bJGBsZs{-WND0"*O<{'}YѾ&^Pa\,SA9/i549I@&@:j76Ft^ !OoZܶeRMէL[x6#6^mŷ9ռ]7GºN V_dY̓c`NVѵ,eX%;~Un:?|PݼKn  shGwy3z<)6$3)_52;{R+&'fc1Z~^EO/-2&-7¾)7VP^X[\m+6 px8xNѼm< C,O?aqiWuG$VJiH1nxSH4Zi_=b%8| (ۓ۽p^.6ijj$bS0*d\V5]W;x]K\Gd}@5±-nm/Qkf'JO1xZ]GZZ" \9;*s89yA^Omȩi +*'ú&A$ޢ5靌G{> GW}o텵ʘРaNF@ڀ)k0j-i_Fei%B_0t㚊h@DI=Z~?1 Vd9<Nڨ|1{OZk7RZMa6=~#jvjRꮫ Yʱ\$`F5K&n-!2G'PJ85O֝d .Wr ?.tRno4$D 1=۞*ƓSQ +]EqQ[k.&w{%"s8=0+;G|3|6ɥ:<so##֝[x|OyKkl-Dجp~P;j~4ֵhֺIJ)eŰ3~Ǻyu}J(Cgz.)yVYgcsq}@(hwqmxnnKHJF-ibi427Bw|:ku6VHۮՔB]<Cqk{B=b>~YZ_ÿXxB] MnAghMU7rJf2/#%ֵCX"n^Ei.DZx_R^RilOJi-;IX鳤7^gߵ_爆jv7VeO#'%<#E׼uxWXZtKBş` <+k u[qyhS{W1t:bxU25v3`7sWY^ Կ6q"@0<Z5}S^6)6K2h j5|WKԉR~ +bŲJ0|ppw>:~=j+o-m[o 9lszu_ö:[ؠ9&`=R'IkVAqZjobҤM"!8χ-W#u8?ts?r"ޯɦ\kodL-6ā.$yP}DZ{yL#;|`gso,"8LpW.k.=:Uh:AvM#D}*ǁf?ľ7/4}GpCA@x?Eǃl4qzz&ǣ.7nkFGU9d8Hݻ ?7tz:]ӠdE4;h}8֯6lG,/7;,+Y;Inq^y|?񖍫jZ%SϜwsz_Y.`C3Ϊ8fTr$Sm~&֚no/.n]X>o) cx 617ZڞCw?!B@qWU/8Q 7|q@oj NJ}ӓS.GEe ^r젂@sSOjƕ-jVޏsV&kW~ Z}YW 5_^"|%?}ơpvQ0P䪖 $ccj>on!QCZۻ6N02?޽1G.;<o( p28ݝ<^X WePy-{Pt/YxOK1ݠ~d?P[[m lKq (25Qpzi=}?RT_/\&q֟)P"zsMjTt4\ʨOMx[\}u(mVH>IJB'L{Zr]eG(sSq`Րz~[Dp-[8e?znIq#5G̝z+IEZ\7g2Hq>𣬍o{o[ܠP{ 6us-j `:tV۞( ( ( ( ( ( ( ( ( (T*vEp54mfc,E3z@=Ar?5!ng.q$l̤zPZEPEPEPEPEPEW o^e#| 7PI`ˇ⎷tڕnTi#%vl͜(;-"u+y$8 *heII``ѷ9Q^mO[k >nYN(=ekIuk~!ݰOL +Vֵm'_%.>i^ 0Oe5#_h^ ߞH1l>uuo-ʣQK=RV|V  Vլԥb@kgt^ۖ*%/^#_^%.omu{g=7e@WccgD_kA>V, v;zxkKxPo"KOP1ȇ2[ofCO i%HCd!rAz0Ҹki:Yc#W.s>P[[Qy0pYHe @EyjcJ.mMpܹ{zuo\A/Ktdʴg=h״"'}̌OAW$9yMh|A5?R9!d%x[׾5ķ7m:?2I8`;s@Ds޼]=q pHz9#V?n_ŖN1QSy~tzumior`3~"31nyb{ ωu['OMs@[ <$ һk%uHz f붲\iQC[<mcBs=^U'+}[V|<Z]Y'{({{x>wtA).I\FtɠR;20^oqrql em0q<shӛa_xAӵh4Vۆ ,315׿Ͷ!E' rZlgw5Hխif{eP5*vր=.H Ay8ުh^#<GWCnжୌ⹝?Ʒ7~ þ$Kvܶ MI'kWx[-.Pڻa?3@EpZwYF[kڮIkZ-sא8M{uccœ޵o*Y5jv:5ϩ%3Y%8 "øe~+/օ;?$ss#zkBWԴ/8T dCX_t m4 ֲbq/ǿ5zg4l!g20|zoiڝ 6tga1kAs}kRiyYE9caj_oVm)-7-y|szMtn5 g\gCchzމq<idTc#y:k]4 As^enN<ٛhK NUMnmfM(^iB>&ut&4xoT3MS|g_|gK:ke,M RGҗEGumY%A@>|ZO<)f.n@n G./}(ipv:uvG @xF1QӢFG8:g7IY2!vH1^y|mcڃ[Y͞ )G5Xwc%w\)q4-⥮Oľ5͎STEi#n?3~&=Hz9U[))9;qw?3C@U--=/k'tM_LP'd9޵_Iy!M1E,vɴ;{MMc&ڱ-OhZ^%M0J~D$e?w_]iZaN0ErK6wJd8sY׈tY\ܶ5YD=OL 6)}UrVNO=:q<]kxgJkti&Dʙ?tVSVew}zh^)E,jċ ) K\ y Fyk⟎4ɐpzhݨ4"ƝiRmssKK8! ,])2㯿@t2nmg.6Cͱ.}ruSL-̋`ѮA@f|S :,wP OsҀ=bT9bq":W^=jJ=#Tzj4} c20#}E^s6Xxz;‡b397׎(<1X<SMV|:ypH#mEPF/\n:f~w)]޿/\g)J,<ObB$ JvZԼkp |y"`uҖ6VZİ DA:-\״$Z'6װ)Ot uf}3TA׳=t5q9涹^}> y84h:3麺}RBxy}EtcF@ ETk$3@ia'^цC)4j@r)h(((Z%2큟YCZ=YP:Z)dտ e4+m"ϻCB,>:Ӝ~5@*zq(A\f>W;amomgw<nWOWLTwIbV0̲T,ہ U<v3>H+~!x[= Yխ<PӬl>zOGGVޡdy{{<%&NVkKZ3W_pMS b? xwPBQ5 Kdq[F*gԒIzb>襬N(((((H `G@q⨼@?yrkM52ND9Rҡ5M{7.H quXcǥz#Fp/EH>ܾsۥkkCX> I!<GQ+՜Td͓(%еO▹{qkjZ_r]Wsޛ`j ȱ̆"dݸc4ӻ<ws@G6Ajpy_Ý:4#UY3!L*A^ׂzRPJ \\I}#6G'?YucLsuq\2̙>2{cnu8|#||R@^1'=Y jF.}41J=8iCo\x4 {M3Xa >&E򔁰d92º|Aix!4HRd߂9vl^ş$ińÚOk|5ZxnEuy7@ϭZW/E^]j21IjTS Y<'K.V{U=H[˦~0ֵ-;meUʨ МX!4'#&>˦ǣSUxY[ֺ?M[>` p^?ݙu9^>l7('ހ</^𾳬τf3Yjvu |ܞzGkm Á69x89Zo^(м>3(?sq֓W|W}AГ uc,Q$sڨ H[(>Yǵyw^,3j <b kirv<SSοf\y3TcU?$w$, x#+zW@uP-s^^%.,b{f#4_&u $}v.yW(vM{(o@7xa/m -*:2By# .q{a:,nN8`~n:zNOz3Ɨ%ѳ.w4! 
̈ =k~h?12 r?{xGĚ}3LMC dWLX8O:׍5=4/ \i:Ȱ[9|)G(Xke 3L*a^Gxs_ԵǮx^[3<McĄaJ9>tPͦxÚW!s][&.n^#zxmignW@{/ ɨ[;[ދv2NsuWin+ 6ڊk|k'yssaoj y[nkB:~#CMM: Wy3,Tz(|-_ ԴϪiܵżN <k^"Ԯ_ypB$pI]m</^x)&.lU4GP~=+|Fp-DifCG54Px[W=;cHY rNpG9^*XtjYrMJ-jI^9Decp''{m/C+4-W){I%L<lX8ERK]iz] -YeJ ׷@7I/m->C!IoduN?Uj-<3=^=тncU+#2 pE{uF6z%ż{6[qV:_5jliI)")9=8{ E0<3@YxN+Ƌdz 8-2R3ZB¿mK=c硊16U}mWQIv]b㷹$QTU |Ru{:0Rqs3o|-fA<QyDFVNps+nm[XͧK)$gN>TsZ%'KS&|S3^Es> e<7kick[uc]5PEPF/\g)JQp?5JT?:- >dQ@P.E*K f&8*zT7iڲ}R|/0x>{O q8<޹ _wӵT>n>x_Pj+ZZͲ\Fd;Hh]<gS2ZuA#re{>o xWI5 jJhwȑ><=VR*^K4.uS]ZzYث,IYf>4qxQږ[o8;\jλ֬mb\ϗ}+48'D|83uL*7}XI'\|]me!oXfh?:h^tvYfafHrS;SŚ63jZ6v4Go@NJC3ʆ$OFV7kPoO.;$!w<ѼV:Xͥ\ks wwJRI3 -*,/Y[}2xAi?aSMWW*m9%)o?*t. m;M v|"*`q>+<GZ[3x1wQ gҹ]=@Mܸ5ðr7v mZ.$=@c^Y^xGU=J3$w6pA >]wy6#kˈCO'l:@ci$QOz& ๠4);ɼ5Kc0{`)72Z?TwoE5ī\QZT|b[$[mGӮBy ]Iz#R/5?HJK9>qqp')!UrN<W] zƓf&ēK̗-#]NU+Cc:<7ݮvKH)k3`((((('@ @Nfe{|n|SFö"t/[K:Gox Sq]kg\A}lw#i0s0yt>;mSO5+8gky$e]#{fGֿri6btPqGy]i_ ;msS7m|8'@wX.y#`+Vxo^l "wcEu%%6\/moy1 U~Sҳ£BACQ.= DIT +"F^>N ><_ 3,As& ~!-|;:+Ǩ&$*'tuuy{bw݌n˒\Ʃ3C\ȬcQy|}[ra*<Wg;R{YYfN<3c$Ҁ;xU_i>oqO-Ӣ@M떾WۭB#3;m2fKŌ ݌>\Wj1$EAT`P/'4,n᳛[ۼM杨p$q׮*O_o_4Kh1{(YxrlyQ\N87;C'J/_{%<$` hkB_H^{ʖRk9F*z_?>~Hd2wDR˙Sgҽi2V4!TK+7fXq(Szii:&o*in/3 rsVoww 4:JYPyyNŐJDe[% ]({Q4Wa>\b*𵏍&մ½q͵\62Nҽ^6Kd>BHä2X#p[xEt=OaH0/F8(>%xGuTHnݎ=:fXuygx՟~6G٬7!!D\c!foD-eV{3yd+f0s HPe7s' y?<ֻJYּG^!֙lS8%RN) =+K֞ Eᷖ&CF͆v~!/KkyR##+z}x:}R}+>5'۴o>^s)kXLo+IDi/r7Ӑy׭v!>5w>j_/q~2:kxgT KXj.<,0n8sW7Y$Si#d9+\·µ"qxO}=jb>:sˏQP./iˊY#wl־{;m#恫K<R'39?»;+(uόokA!2[hs:c 路A JWck鷦tw8#a>/|j<3/(ſtφ$m5F-?V$1[K#Ir`F lu?zIkq7۠3n֐s^/"*_FF7ɍ^iqhgpnt`ӚFޡCҖ(N(((((((((7P2:R@ 1=iQ@Q@Q@Q@Q@_ҸMsi+Ek{L֟)P"z>RՐQEQEqS8PAa{;P^[/X-Iv8 ڤ>jдFsR̂fO  p@FXX,qkEύ<:"6UZj-sg4Cf)cRHn>QNAԊ;3ռ1[xL,5!\YL$ 6~DZC[]gSÌǖ03XCZrjʗ, ťg{HpwZ?֡z7[ê]5F:( r vEռSi:Y UH4j0u$k[Uǖީ)* /|E=v&oo.&K̲sqE O薷Q=pwM͏w<UDQ倣с\YF񹳵u' >iZ i0ȉx\uҀ/eG$j> jׇbK-R+u3ZiL*r):[Sx9۰=+CE Gja릘3sӯ?A@w>Q׵CW\=]0JyPʱy6VM:}~ ttт@דS6;.?5L>\:/QU#IL.o\mi} (#4(K@ xÎk.?.AkZ"\M\68VV8;]BoxMV=]enP@>\EzzeR*z^8i@XdJ4_1Svczl)2y\}6G3>im?pSP5O]V IP(:rOZu# ((((((8:8(3UѬum@VA bm=kk綌\@P#rAPc g05wMѩESۃUx^KMѭg9]88yg<c@״䴎-XM3ֶ</V\+P$HCs_҃@8Zo~B{kqv_Zu jVwL7ƱDʠ=t>1۸c* yE  hP>Z߅`Pi7 y9Ԛ |pG;vr>UdRg`w ~5v*pC{qy<Nw([]F{}GPr~C6uΟlG2D^k(r t@'q[vejG>gJ噶,ⵈkYPdGjq\0 h!x!/2;DFӡK>s)2kwo< {P3m ì.=7jO1szFT =9 ӃS9ro1]]ݶm\-ĤdV5ᖟmk xV:Ү8,p:9M=(ɼ-54 KK{Uaq̲}!~= z~Z|zw I![󜟥yQ9oxfMPʩ"r?tA'oW85-Ra,96{5؄8*նe5ԩ YَEbI ϫI\hI.[z/G ŴZEtwJ9懭YxCմ-ku&u*v=*$z n_^Omխ- >5/W5ڮw޺8S8n>E4[Hl sM~WD[K)}9}]K4˻y$' "[k9%/]ΠҀ6sB$4-/$ɒh?Zc#NծK+ᾼ;&S7ֹ75]nKNA}HUaq][㨠 Җ J$$J}8bpE4?\ SSO $ɁOOEfImڍL2,ʡh]І &0p *+mBkhl(G['@]y:BD؏eEs~nδmCqD>?$ǨF)6:}Mgv1ߥvS<3Ұ<O}#)luwܾT[DfwlWƀ:*+O1G$ʚd?4fZ os6eە##9@Hi:RT:zPo=9ZA0=hJ)Pzupxgv{אF(xMO$đ7$~̧k6'ue 2}}hM[q45Zmy 6Q@G_xg]I eT۷?y['4 X%.E$y8Ym: h>(#_U W }3Zwz_ҸMsi*ZOQGҖuJZ((t:Z6Z ŞvDvde9R8SίK;{4[]f8EA2(7pyWe&[aV9y{"@~x6Ò]]B4F g|6UQM*Dic$XH@}s^ofvonamTJE6 $"(QHaEPEPEPEPEPJ.1򥢀 ?j??uB* ((((((((((((((((((l={P>(xHºgij4i{,`{vCS]2oQnv!>!MfQEq)n>^>_JK'¹isM,˱TcJ%Ok=^0&!1qڭh^ ޗ'w$ߚp-(ڴ5jxOÚN)<7QL*`?ixǒVT'yqv\ƶjz ~`cIzW?x7?6Wdr dRx)t?J{um}ȥ;Niּg/a|+:$| z `W~luWx>3׼yk-#<,,1c#փw.J.!q׷Zo9d`?G_ZxSB[bufZu!=gG_ZNMAm$dMXzvZl]P i3LxH&nxx!F=?_ jNZ&"w6\Bѭ"| &8RԴ_]|/*^*= ⶧6N#`Z_+.<;]Ni%%$8ú=k}29,* }HQ1&yxڶb[bK $q^rf`hS8\;W߉]j71%|/, q@/ E|:1^XΙ=<$K KmcDNI, sKyCiwKEu=ܮI'+Qֳ {υ:ͥ[Ėz|FC `qhׂ lehmM)$ H1yY*ˑyVk˧å 42^p'Lh2b!3^vx1z.NYfjH?w>֬K- ( rz瞕sq(m}+O39u7<wjVR+iȦ(Xcp9;K!#gV-_[Oݰm[PmNOֱk6t m=9I?Ě.ᯱY&vRM#ͼE-{Za Az臈ãN[-N[*֖āx=Z}jmQ1ޣ sּZ4K⦧|.JHX- %Ew|G[.+BѸ16^:&uYi cލыb-OMi<LOl׆^'|@$c[X&~؋#gps4/75 abY]W:ɏ[?uL7??WOgo慹5S]g_!€)K[t>pl<dk:.2Ֆʺ/UI$: +qK jyX+wV]*M&23pq@F.d6Z,QB9q!*ogGRM]K1RZFLr,rOz﴿ P]>-+mBm-f[~1}S/3e=Zm8hL[v 
s|AĞԴ.INӭf|A+IIҤ[$Mdx}J7!Ri.J6jk;xe4B;X`= @&h#4S ( Oνi>?|H1ZYX\ܷOBQkn O}އV0Tr#qU~&;zQkۀRT$nĞu@&i?m]16vG>U(Ek{L֟)]޿/\g)JGQGҖ((sSI府*2ǂ~U#9=o+st˟4.GP9 Z>d9(™=x]%F{N*J((((((((+$UW/a% QIP((((((((((((((((((W▊ۂz_ZQ˴_zEvmqԔP~Wq15%O”c#=PZB?3I%rctq Q@M&6*9cp3TP1KE!Qy.a0SQ@t1 w(SޗAϽ>aӊ6KAOEU+XvhI%@rMM}^%ۂ=#df1FKPdժ(DF׍vʃX,\ME1U1N-ݝ{f#$5%[# O(]'ޥMaqZE=h0A=GIE1b x=O(Ek{L֟)]޿/\g)JGQGҖ('p ē F|˫".8)';Aސ6ى2\Hz!}EGZ\Y;Ef=\p PCƻO8( REQEQEQEQEQEQEQER1“\w<ms᛽*JGXPYUni6ʹk(IcgOOO_Y i>!8n?=];S* rMXukEizIu)۽uT,ѧ[;[mg$ށшʱF u3a*sĬtQEQEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPF/\g)JQp?5JT?:- >dQ@ /8_Ś󵾋1[%Jg8*a@RFkz;KY&eX{$W5nRbG8Z`Ӵ-Sˊ;U1HW9PEPEPEPEPEPEPEPEP7*kʾ-U^ 2܌cU=+>-sY7+R)JztyycKdOJ~t]hq_.a 1fOB+<`(((~n&/`M \Am nQ[>+ Sv`j(C ( ( )7sҍ#Rnd)s@(=hܸF3Z( -<p= -PHh#֍:pPNM!`:Z)7Z7I=:ё@ E&Oz\E!`$z=JZ(Ѓ@ E&F@ E֐F(h)%>qZ)7ތu@FA(h=h6<zHXp( =@ E&H]@a@p2=zPFE&'Ro_Q4QFERP2H"MngҀ( T_/\& iRJ5ϽkOҔk=DuJZh#;".R汵N=Jfc82X^okBsl ;PQ^LV&g͹Ŕ|8ЧjuQ@Q@Q@Q@Q@Q@Q@.]>D4<~W/m"e t鞽GOZYj:&`u. ]\2XPkoo śip2@ 8bPdA@{kYIw{2C.~{.eE !\'/5/6jMLdVs?cڍXmom0}ͳ-u}v03;z+-ovcߊ<WwK;xZ{M6Eh ,^cIO L#O}|뜑u_SҿJ?li_ ocҸe?3_.a %Q]o_Ъa#Ш(()u-L\5~<I{-θ1AdvpI99>õt@a@',Hc>moF%5#>aYTJG>%ҍ6`|])#>Q1,? jߴzm[[;r$pܒI$WiZce)r̰1(F1wQEvQ@#T;N:_ILcj6GsS`(]⿇z[:Mn쯓3,D'$88Q؏^Yo]LoG{ӧzC6w^tWUյ4[iE.g+I4G(x~UD$g* ־ bK)I$pEyvztGt$7,ϸ1wo%y.=s,o&*[3I4VwBے}2*퇋4MU 7KʾkuF|8t-,ɉ$ >ӏjt5Sӵ;fAS1rFI<ezYkV4w2:[O5vmv屻;v}kqE=@<jti~xckj7.@+zJ{1pGMV~)~SYVz!KJ _;J +t|#Ɨ@ s`Ww&HfǷtJ{%Z\7Xn:qq[zmF;dNN~M\VcHAl]MΣgk³ws[iZ PZ^$cYw '^}?Z^r6p3I${ omٮn %ij4!ݝq܎9ط hӾ#鷿/.bk8_5_2V-sx-#Ğ5Śc1s2>թiM/qgk$'_¹ͦɭ|[T̒]1H'9=Fq=v]{Jң待,EdQxHVѯI9SZ V5K]*g9n# z/ Zxb/^]xz7sܒv㨠OҸAӼ?-'BuwV0I#H5~-&8-;#PLOԞ9S^_ښvpV ďSqP>3kg=_2g,q&G:/t)wE%J$N6qKq ߄4i˯ΘlL^0u-/SfxTPg99_^ I;#P6:WɤYmt wۃb<1{sMDm/|Kٚ;'Ph9%]{cހ;x{,:iÓrzh[xZz|&آGzdμ^\x:d : >lcַ|}si l9,W H㷑|C gr㹠J<u;TO a!r+OAt-{eDIr=kmeiuxh2A't2x )ahSZ1. yڀ=)W9G=Im%2P|{+bI=y5=;⏎mAtLIZ @? 5n|{{2j XF r{cYúVi-"2N`U?9vy[{2pl,M$cIm;OMiW꺳ڔDy8y`x^ϵt8sjil }xnWYj)7wd m:H?5PP#/qhuco-uԼXӘ*K9(k⦷9Usx_Wݣđl,XKޥ( Čɯ-rh:=~/p,f"G89۞þj_ǫ-KRh~^v.sN^-tY{mJ,`̗sSO[h`֭^K w*BƟ,bYtY3*`kbF,0r}j&5{[ iD< 9b|Mh-v=p\F?gּ?:͟o屖O$tU`wW;O c}ye7ϸ;Ddf:<X_{+Yf< qҼV4{>oplNBmR>+U:sWviVy%)%a~=?ƾҮNԵtH,W9wlHHVQZ4eʩLzgYSr|x֬If]]h=K%]h@,Z`}y/ [bX`h\H^@ωtM _jXnԱ!& kRj6MUw6v|Um FM<_C{4ߕ OzY<#_:uIɪV\] 7`ӊ-6$_r`oS85Qsᑪ.uVI|\tc9INjM-{ g*Ց֭#XL YDRXKi>Ф{oþ3мEzCi- o-K``p1޽3OKIhf@*S%'Ӧ5ɵ0~8LOz@v5>`46,w3z Oeɪ6cB|Y?xuS@/6}Yeq?wzxEK81%oIWG_t_ jRHt~:RWzE(PdW5Ο -lhY1>z퇍|;<c[LaO!#TOoQ{-+QxT;D9^iGnuGDX5jїsOvVou-HǶ*ˈ8U(25QpzOLk6>)]/\q1?IJ}oR,qKXm1sU+ySUXo)4yZ5[7sjuzΫ7b5hj6ۗF##ңgF_b,fD4 `+oRzqAY/az- -J%nj\KKiiVg) F0]Ox~3E?=`B~U0BrN8] JGYW%sԒk/_3-QEQEQEQEQEQEQEs ZkB*+3(#p=}h\uB(?aC:0=(.KқM?c?)G*zc 8ikoofm-n$WaH7>bJ/ xPu*uOsnE~#>-s^@^YoFk:ʯŏ)JҿJ?qK=:g<+? 
]krK!߃ֿ1qUa~&,GQE0QEQEPyu/tm>wj9$d֎H\ #Qc%i+,4m/Jgm3MimX85vd}<3I'v4U* Q%c'n{sǞv c!04*qRQU$,ԣ丌^JD?3(qY2*Ih##]Tu&1q ۟ h4M̭ԑRhZVvih0 a@wx?*C* dzP{*PGAw9 <Jm6x*qSy= *?_j[[ImoZEo1̰p -X"Yiֶ/ŊPЀ9`=qQ.gkv"Qaډ$¨穨VҬRڦLsZFEV8UihCx?0L" ú;*Mۦ#w)AE\.f8+$fMbqZ*j;W#[~4O{9H0P7nrn3 b6*i=.;iI`b8YSh/N:sҀ!Y[oHG*L R@`Þxi5-}lpHM+M/f1Vu'T#ú*4 :p9kmmsiXuB#0ӌ|n,j|OoEo,%h/mɊApph&,関М_-s^zT~eBZܔ%5oI*Qk_G%P>6yA('MtN0qҀ%9Uot Kh,%pLyh)4uiW#w, eF=GJ[LM@#dqWx>(_:M8-څǝ\ c>эJksiw1zq-=YZipK W#;xn1֟#@-tm6-mA(U p9bt'ivp+n=kDHf.G=pH i|nN{mԹ3O?KgͰrD*5i㆟Uτ%Z![d2GL{z\sǭ+˂3T/4='RIivw2(>V4^/"oכ8uu)tfC3n2@hN9֤p*?sciz[;i}Ie3W~cz]8!v6Pn qPf:}L7nt:`})h.sfG!x It=.{Xd\/OU(.a^xM($JY\&_'eZjJPȫsWpOJER'@ot]7RhΣamws줲rZָǐ)Lzm*t;y{6܍)CV#O)^nL~6\fg@mr (XL[I'xaT-HսpOJuFEu2QnWlig=Y Eg[M5g ="=Cxn,|C(9<?F4z]4J۠9qV,=6k Dd14PFEW 53*\{U(i֑vI*=r;D·@E)D[d鑎zԢ*A;[ha "8S9Qz^+Ii0@O_ (25Qp?5JWw+>a?JREҙ3E/,j()>}kM[h:t $ќ<nYZmx}v~`Ls<2J.`cҠ+kD EQY((((((((((ok.`x?AQןHƷ\hcKFh.آȮpçj*'(iIB-_ j%|GGt sGrSq E56ْ]W]@\#?Mmᧇ/~5^s$VY#xbNPoX(+B((ϥ-#P뺦g@xgMRMjғ2v ;B 9$d0P8+7ƿ3Z^D,q,w5am>]fX<O~ꮲ4J4_׵X5M^[|ͤj'.Iw53R&j^_9H`#<st\u۳]Y&g^*}6 ;wUl6/8I:P]4x_ 7vzle0 7+kboOkCjJ&Vp18~X&6 fO6%S<#}Դ8oog\C@wɫL9?]nK3[!A.ux<.kZk3X[[%t7AӃ'6xsZJ~%YsSjkh^h7doX`H }OsZG/jg"X%Ki1pr~Hdž-|/;iys-ܼĬr~?L>K^mlG+[Ma<m "6;sd 4dҬn4"I#)hٗnsω֣^/^vWsks^i7z&q]iVf5͌#[S\{h丙n#R#1@> x{nk/j$3\[1' gc֋Ϭ?Eֺ|:^-ǵN_ /< &jiq d- ̀zǻSuI2hǺ4NC7ɆGu ~#[a{>wvdIR~kCҼSc KQ]Ned ~2Hcu Ns=2rcL_M:iqkf`H1y_ހ9Y>0:D~%MbCbxSSú_d ^߳[0I gϗzzWqkS3MۛAlH\8<#7,uY.nH.`+}Ҁy |FI#17Rq1`Ij?pvZoxvH$Cynɂ~=kI[_ 7.~j"ܪ O p4e.mu{Y+ (>R4MWK6Ku4Vi $QDI"Koͯ\&# 0rGl3RhItiz0ɸYKrPiM{.j/ cXdޛ|S]GZ]F_Y1x.@:p{WXX1Y8SF$c9=OsSOOԼW{} ĪΤaqmב@D?m9,*,<o\Ml'G/1ȓdz5[ͮiھttMʑ+wzv>mBv{Bs=ˢ O@i1S.agf p6.# \?,_ǭ_jJi>qT*:H[5/!MZ]:$9Ɇ$r:L2g[xM'/K n'4[T w[3F>ukWt_Aj6z旨my&W+YRI#ҽbIt=-6byFYakɬeU:tf;mF).xS'08U_İƗY| <c8BҵBV*n3yjȡyp?g:]_~f}F]*k}?<>{ʬyAxH>! jt/FQc sր9} To~jFR`+ 5VȯI\d@H/ sڷO L./!y: a񵯊e40ZBMjshXXOLmLatw#5xb+7t}yu=UAYе-Gq2<JmR͖*ϡOn.վq +(JzqsWro/mrdOQZ}5/~"5.[*P[<:eG ׁۥoFl˩=ئkh9#<yQ_ $`ӟZ%lv@u ݴ|됯9v2Y' -e{Jsse itdsk>t<J䫋okh$`qIqҲcUk/-WէJ#A7v]@i:O4Bi/DΗ,Ly_o^VtkZ׵϶RxglQo8e~lc]Vۋ k[wڼpVrcqU?\xgR.5ZHdD q@\$q榮Z^}ZUv .cϠ]QEQEQEQEQEQEQEQEQEQEQEQEQEdkJϽkOҔ^8Ҹ=pi^saiLT?zae$2"dm>l噧/UѨ6bhU'æ}* Y%Z4,<2rW88=*,QEQEQEQEQEQEQEQEQEQEQEeqLa IE&(KEFѱSv)PEPEPEPEPv/(:Qy-݋FQ@ ع) JFiPB(9 )hk㚊)qIP2Pp6vȐ D45oE93+E7b4ykiPm)h_N* 斊oA)h()h(Zw:hA(?}V^ym__(9z@ מ:ؼ׭-@F)hcZ8Q@Ts=A寥ZSkKq-@;REQEQEQEQEQEQEQEQEQEQEQEQEQEdk Hp{_ä{dl` 'm{"ݏҼVw\1lY*+&\ e$cQ& \/KQ<6qi]<G 8T_oմ+-[F+k1#+8f-\g3T*s}8 ;Kfh-,vǘ=W<U/ wcTFZKjy64!&* {UzQ@Q@Q@Q@Q@Q@Q@Q@Q@Q@Q@Q@Q@Q@Q@Q@Q@Q@Q@Q@Q@Q@Q@Q@Q@Q@Q@Q@Q@Q@Q@Q@Q@Q@Q@Q@Q@Q@Q@Q@Q@Q@Q@Q@PN(xIo%A0Gǵrw-Kk*'d@k )ٽMᇩ5t='Lz4 $K a1b~ER<Ȋ9YQZQ@Q@Q@Q@Q@u;)/QLnsjʚynfb]zN=+;X@)h(((((((((((((((((((((Zj6]Em wʔz6M/OX< ⑎~d~~.4xL<}I$~&(a9=R]bVN i'f^VW+#b& 1YT1\ό+g kYLl1/ x.:iQT(WoxŽnt@mٺ^q _EX*XPI#$nZE۩ (;08r0 a::?u|Jl^Do,dIWҥs"6'Tڗ&zRs(Zxn/AE^ӵuKO[ C 8 Ҹ K ɚG(N?{ֻxQV=ȡL;c#tbiFG<D!7$ڷQEzAEAvk6$HPcҩn0rW̥UE٧3NjlIĎXHB! 
946hA\**I6'2Ȥ9?*JVz(^:z[h ?x:5- 9o݀bqa7u;[_yK<{1=OESF6s>ZJ3$㠮BVOWn^߉U\;t{\5MnK!(6ZT)3ޠGh>[o3XǽQ:P $2[+^TӮ~xT%DfTیzqGq{ݿx4f+SoU׭thMsGJІh Iu922:u«4Q=P?Stۉ|.HϤ]16W1?׹xX1~oo1G82ܔ]FtڴwZiop7zy϶jeXɫի\uR#ѡ)J-hl^Do,dIgZxn/AE_Կw\_A5W]_XXLФ8 BwޞJ'$s,Bݚ3ӵuKO[ C 8 ҭTp{B>vrGZeklwÛsnQP] p&D<݊=[f׍}(#| >ETҽgN?+>j\J-IOL)ڝܐvx7$Y?ϲs=V-UЖcv, b{rMmJe J]"PCf~[, /,4RpMT5[mkNK-K0N:~Z&Z7uֹq +"G6;Nj}#:B,6?[Ė*.&kHSJ\1Z 5v´dq$ztX#GT~RF*.Q}'*y6QgS qz$3}tkQP_ v0v}&)Vo5Sg\}FlnOIS^jgEJyw_/:XN?^tZ~$3!F6è^U_%Z窔gdvДM9nQEflQE<M\=>kxn8vnyl5xp5Bp xϰ;GCZQ暹͏j>Z^ !LNb1=@}1ݻ?j<;c (v귛xuy΅>Vo3lĶ\]V)|ֽUtfOƽz$dyg{^ >Ѯ.HbRw8'52lV8( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( (9˻mcı4)cj!B{mY̏w:+-ќe#Ҝ*=dݺ8_Vk<rF$%o3p?J篕.hnD3FTTQ\ZV%YNyqNao-Lw[H';=xXX^G;\ihA.Y_î=xM-r³zk=v4^b)䷊h4,&M{SGiX(1:i\Ztd9[ZJ1,~`' c-kRl鼨نDjYj״ۋlo֟1#fH.{:]Z'wJEN)-7v̱%m [ /|(Y[oNAjwqq3[s9^'U%$] ,[d??(dn⨗<yw6eNmOcG\ *#Z/?Ш-.{')3P{wzf7e&(NnֶWhmF;ƸmK\&;1>B#fQRkFI9^z_3PjZ}&]VAing*TJFz2sӟYoS+@\آu#/h Bpm龖lpg79P?Z,KhP\7fgV8]]^Q?_&xHΟ׾0<Cm7iiʱ J ;VE9 i^!}C(H;[+BjV;-ܹVk6Rv2K 4m9j~!si4dyUUl$Hld|`` `9jUW)jk~&tpNU}ktҰPbuQݎbx:Qc8 uTRtO* k욵Cw~m-[f(R63`qG<~&ѷ'ir  K[h^)\+(«d*ut{KmyYtG '#>;WE?g5nyu/kǿ}|Pգ}RtF(?'=ilˮ%杮A=Ɵ>2A<StդKastelr@Zд$XV0Aފ)Itc98'w{[[o'#Z/?Ыvv|mvqXZEqʤ=ѥx'c~T{}m(E?Ozl<9@~٩U$wwTVU!kjNڧS2Yzf.V; rxǩx{ .%GXLRp{Oy[r^]iZɫz-kEZ-RYUwJ4}Zt,yoԚѢntV5rRn.UoOJ-3v8#t(<E?ympn#*UgW .XxMc.3i,?>g fڣ;Ƶ]f]Ӆȋ$R; Qd+c,2w,&PG2HCUJ&"y x2Q\`QEx[] 2[7,r"ޞX]Z=>cDW+zu+ D\IFm\?u31 j6ԯкH,j*ךi ˤ?j"JQvs/#Ku,sMRۺϟױ1šsSI<κׅGYF'JUGy!{ 1CTe(@˗Q¿s]TƇ|Ek2Mea xedk,{2</AǞ1dLx`q +|]$K]j7=<a]»ەU?VU:g2|h sMErx{4[x_Y9yk yK}bObX.6\1;IdQ@Q@Q@Q@Q@Q@Q@Q@Q@B#h2jAA+zW?[\4[0I$G|Bd?*W7foNF*+tKY^@CWWmuݺMm"*sUO܉Ӝ>%bj(3 ( ( ( ( ( ( ( ( ( ( ( ( ( (2|E c< Kk+<)CҜ fkƳZK,W?>#h3Mb.vF~ 5kY:!!l`0_zծǗORugM)Y+;d5=@+:pgg}WD_?oΤaItU+Kfw`;?>|Gb~@/a~'5&Un-vkoRyKX.vLH] tj:f.u ĞIcJZ,:[`27vt5x..a HȄD@ 涩J4vsQN+JZh)"5";U|ShEq#9Qp8,oQ]Z_Y:b;{u9yYO}9ƌ$j}AR&$M-0leU׽EꖚU!pRoY}M!ImS8Sbu/@52_]:'JV,]}iͤ,2uEW:ŘԾZoݳ}9یgq\P+gе 4r籮  `pGtL<a&MvcG:IM4z'њQEqZMktY6}]6{uw$,2$Kʑ띘EmW s4نK8i6'ns5J.twǛ_*XfyoGWc}oYGwe'HS!;xB6dV#ʹQE"kX9 F˞;NTyf#2p.minz3ohV 6"3߀۶67c[sF&HRcx?^[%^]V#NRQz2J Idު[_CR&HA!<?KV]k+|4ߏ2[8q|d5%veizCbxSWN7{M<֫cR)[M:gw[R4ˊ4.SEm|Ҥ d$ bu*O# ϯC\f<Ao$\DF)^ E1"J*$)T}Ko}G7tzm_Ovf^\lPMAj:ͩnx$>W+aX&kI5 p{ XO4QX{Kgb50鶧syZXO7M*Ͳ0;=p 3Wܠ1Xp_ui ow#ȖQK_QR5O (:((((wM-.\vdۃ܊J?]q \؊Ό=79]8Ej- 1a;z8u OYl?][Idz׏:k=xӜU 7z]RiW&N$,} Σ^xGG/MWH,yRY[:u0eNxB|"Jkbu,{nm#1I"_G/'grt0WkH_ʰ2I?.?=k?UtfKʤX`Rͭvk*M*ea1`3˽ ]u%@w,{s${\60SVRmFh&\['dU|teFmp/8byf2rrcp:`洿=Ф⿆Y巒m(:p*k0YZO ۬v.#xn Yqr3p4neKKA4cyU0gvHI;uzWb<@tV#a)T1>YL|NNGԃEMѬ+(e5yEq8((((((( y~#Iv4x bI5/W>LN s/bO)SLHux o<02#<-ĠʫW-,E(WhQVL%'}֜I\1]NpBYZXN< f$jD'bM,nk*p'xjsKHv# ( ( ( ( ( ( ( ( ( ( ( ( ( (8n>&hb(Y q%X}kT(@;+O3"#e S(+Y[rQʤ%2g6,rO;\㴟[&Ll gCRgP _ drH4Ǜ,t#\Ж=&# |k?suk(JvkѵuoԷx}L 4y;v~V甿'Yz|Ż,@p w'ҺJƬv^GN ec{7k맓zإiM$Zgn>T_ˬx^fbƿ*AVj5KݣJkO3u+mcoo.bXeJ#'姄[ Ё"R ǧEX( zrik_m "ݻ_jsS\J~ƍmJB7;P/M2P[*;yhU{?svks_S/EthJ4U2ye'@ E/!ޠg# m"q*p:+.hZzgQp[i_2^[cRQ{XԴM.fЭ}ˌæ;}1]9#qmc<$g8Uvf[xO.4oی ӪcTlK"UqqOs[u3z^wvM6ԼR?2TM(@T%.!)*]jT}O#_KZ ^XFn/b|o nO,u46;qxsQKK=RwgUOvU֑d6/`g|'8k`.nVuUw}nmԹ_\I}22[KHAcZ6k0 m*4kYsd~5GIG=QDf,3T ǺHO NHt·(E;{jysf\HCXA2KrCICеhQY{Z3Ώѽ܈+kwE(Գqo^_ d8=<><vUҝ8̛v"j{9$t܌+iz z"a$⳴h))ZRuV註Fx&*R\63K~r;MIY2%- bCEiAr?%L3} tu U`z3@T X|t}M{E67ֵ~ $gXz'm/?,S`n zZݢSJ>*mS5o[ã^x}gѧi&O)L]QWZ4V2cR.-=(((((= q:Z3edBB v\24>G{kVi܈sY2y#u(ЬR.لhHBwwml~?kRϤJ yKq%˴me0~Uڥz\ȗ`֬KM^]xX0<KQEuJ)ms~%˶Ic@VQ"Uמ8":1@la&<`eINZ!duH#C 4WqglFHu8Ai6WP}ռ Ch%m $3,H<@Ny)>N$Y`o*hg Ҝ@#r+7 s`{K V4rKwc'@w[z MttW9 އ=no&N?u34\'z?;GEsfhC_7@w[z MttW9 އ=no&N?u34\'z?:7ծ/+[;ǼSíRҼ;Emk܋u$p=- o#+ lfkz]}64#X4S#Bų$wDhR7!̝+հ/d$c'߉<IsK -/.QlѺEe$t7 
t.d[x&ڮȾdqye$ӱa9en[Oʸ7AjwzEȯ$@,Ȅ m@Oº]Gu4b(~o6VlX5mI4A(]Fqۊ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( KI-dvL5ŲIv\bڄ{.Ybx#{Vs:]j:r46'8aJTD$0ҶSӥoS\UIJWAEUQ@5 u:3@]o _z[ jvp]#Aqq2>G0C ? JoMqp 9<nK'k|Uq Dw|CiK<ojomז0m'2@2#84hxgn'o/W/Oɣ]* Jfm՘c%=~|ßD2mE=7F˻>ԯ1,LIo> o__VW*Եeb!k vwcQzԐxGӯ-/.QW{{dUYYPkߑ@?8>—O л:Qfگ4,_gi;tnHnʜ]usxW7)(:{Me j[I Y+~?_ƒײqw57,To3RHאW)rRC(I5k[/{a܇O"YQJ6䞛x 蚗[A^."˷^6txW7*ŗ4Hm[hWoo+US@|=2ek7TCi8+Eyf8OנH_V|4A4RHbpsXIp2>t4]xo1_]?4N3@v){]ú|dQW%C1ˎ(jUvf'bNwE!T=WqVpFF*(PҦ;ּ9hz.^5_21Hr@$tJ͟@K[<#칔c޺O 4 ۠UX؟W" vMf⮝÷qib?t>nQ{Ya"Aw - K#R6srzVt2M#Z%(ۛ;*( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( Pv pQ@%m:0,77++}_㍻`ge'X:(:xf~;X.O\)oڬ!VXf7+ 4R)n~ra;9Uh7o?6O9o9+~>X:Q |Fv# esگxS3 B {f?q//|l0T0MGm$Ȩ *z(@e1v_p{Mý^0o pBݎX{֖[lv6*V $}9o6:4 f[.!QyX\msm,iᅫtt{|'52MwMty$mLĊ7!L!X*IQ!l5x;vy<m*O\J3Нh&AVw> 'P~O5@fcٵ$v/!ZF=Y֍((?DžR4lں. n q_\ .clpp3]Gw "I#B?QXZ$4]$\JmrVj-NZՒƁL8c^kҴ"JeI<YO5R> bAcf;rI20vUnfxS(9(((((((((((((((((((((((((((((((((((((((((((((sž$\*=~R/F W9o\iW xIELokRq6 O_0ѫgX [0X.w+)Q4g^:XkZmo"WVО]؍LjFÝI Ϩ[a=HLow w pLp_Us }BaVp,1誣%OqW~#Cy&A=&_=3rvA/i~ / &69ߜ"9-;]>L;x#HPEs7TKwRn-HcZFXq]xPҖ(((~\xyZEvWsg0I\%LѱV(FG^p+u]X{6F-Xb03Yx/JдY \\a95tѫ&r/ #\G=nz(ڠW*Š($((((((((((((((((((((((((((((((((((((((((((((((O~񥶻2Z]<'!0G t"o<|YRk s;9CK4`BWm?jyMV>HZ k ;(?0@OO񍦅_<+!Gue9Hu* ԃ:-uFRB YY\y'$LV(((((((((((((((((((((((((((((((((((((((((((((((((dZxė$Nk@e^B܀O€6(=kͼO\\x}VOuJG#pvV7}=J$!"ZQHX<EvV0VQGy xvh'8V_i~(`:QH (?>em3gMsr|KԷiB'猏Q]E*R1ӳEb&m_Plf;e܄+rARЊ+A EPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPnšw7),;Ƹ%K%uo-sGC˭@&>"*ƫ0 7uحӌ~=092{WWP6~voM.WsGuMMHVr*!`?J467J&(-7c,Oַ8Դ0Fom%FvB\t^>|#l&umGd(ҭèJO<*1^~ENJ:$MO5̶/_1aW$%POT4]*DӭZ‘.y''=jzM+3' NjzXI6Ӭx˂6BFH9R;rjDŽt@xf O`H4c!O\psW4sxGZs_%iМȅ];8㓪|Dѭu(3 ~gҵ51 ;!XUԑ k45G[\A ^MadF (0G5ѕ<:HW]\׌huxPD쪌 Jv80J3wii&fmLa(vH|_A]>^jCT+'Ku}?QmB<[`Fb '_\<䘮F~eEU#\ 7B|̹)#+ zf591 rK^ r5 ֆ)!':V&Kx/E-g?kU\%~2rע+0 ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( (9GM(iz#]A= } KCYkZ[%R_B-pnchj|Iw5֨iPtO칿8ta%֨6vr:W.8ÿ -7} 3;5 &MOZ[[HFz$ oD<_ZOlI%ʬNc+H4$ɚ|ѱT1]qnC#2 OYfy$x$TpJB;P+⋅Ś ۛvC/Ѐ>b5c,qeA?4nEejH~#LkE̍ R^&RylI57UNnx74A=HR ǀ?sZi^֚q Mg,[Ъ;{P6bcE>yTrۘ$]U#56_<9 y侷`n'T<䎛 ׳G"̊e="MMGSu c]~6VZX%ݤwm~89 X_JkKx,᷁6E G_}K-zSaOX^(XQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQER)ku$I8h]bQ6`90 zA˘H+}]kgd(Nq!wfKxK_5vLԭ+fG;VI$hqykw.g;@# Hn4L9-%XC2u_Di#/|Uٴȴ!c$t][8njkjӄ6n!xԵ0\F֓>춥GZ <Ig`An_izmk [oXI%j% ͦKcb~b L< ;[#Vs]]vJA %jIZ|1I=,2̸FV7't\bIj&rK8`֓f7F9|A>hbI wS(8ywdWhSdnV9$m)T7 ,æVkH0Ǔ<1&EQpOq+c6Uv v+__+<r1'$UdzIvR]~^xwUy4ֲDUNҳzsY=;[mByfg Å!D8<28v_ T^^:yQ|Vn$@ Fz.a{^ SuK[+m"\ͧIWI8A КxZ\L̑ Oc>h~bi9|F:_R; B2|yQ"s m=.9-M}$]D8bJ t)VU$O`s_%m_FU9Wg]%2)mX=xOQGXCYgtkor+g g5Y}s@׶lmn돘{2q¥>ww%K=ļ- ,xJ>zhY9R|ˆ# ZZgҴotؠHԄuS '*d\&CxV?zC޴u h̷:}fkx3av Btړ1-4³BΕmk,nL̾swu? 
_꺦 Kr+x$߼[aÐ8‹ h̷6{Q gO$9V("嶸PB!I' 8V48Elz SaMr;̖<`oEPH9VuQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEsKEfúO0m"*p?:~]y ݫf94=X՘0^vI)$:M6cLػ8ZI2=kו|;k7&ݮpnGkWCyj$Ub^O>5X 圆Cr:c*o&=4щ+PQTc ]u/j0dagc ˑAY`\d"JG6-ZTiQMуHX"d = n5&o o򷕸 \sF:VQo5E"y'pGZ(+ .40xz] 䙞 9,8*4 Xt`3:|.Ӽ_]ڝMz qjZpw*F dp󎵏OKE;zBAj#%ʕ9$u|3ɡok-Is4xy'&2O@< qc:J7W%X*.y@GpSsF;|1a̖wPW*fr\C|];ͼ1捯~C(N=ր3wp5lV 'BO*DEjHčO%bmn|ѓ+ҼZ|S O CM:7Ii"0o3h>\$>hc2x2m^&ۋH;w (-%ՋW!Ef/l?:fQ^kOk[XxiJZ@YԼ2͕ן uZFdna#zV<QܓDv/oo&n@<+S?ﯭzbF66waq(U%9I#m2ء%==}1~>E(((((((( xMqj(#i<S|;WSu$j֩ťi/,3̋M9f %w?Rx385.m[C6Vi(2ˎAƀ:" |j#ᫍkNvkS:1ZC{k{$)9 }+(((((((((((((((((((((((((((((((((((Aid#$+oX!qѻ05Eyp|2\xz$$+#|6&6LU-kNSn"GKߘq*9ù2}Ɲ0Z$Qidhmʷ b:M:*\\X;J|GpIEQg*(PH.ƨj+Vx)PIUԎ=A2%lm96װ4Ieq@hh_ 4+TTkDPyX׾2ŏvRWCuoM ȥ]d0#>ة_ &{ &HJ7@$V Y!Tq.ҦI2GLvƱFrN RG8khth{[hV"AeQGn'ܧ$k c[^,`J37*D+sWQOum` %("F5U(((((((((4=?Ě<^Y\ ē[ǡ(AǶqMm/t,cu)#e 2nO H=ZTP; *NyVi3O+L 'fQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQE
-1
TheAlgorithms/Python
5,638
Add pyupgrade to pre-commit
### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2021-10-28T14:37:17Z"
"2021-10-28T14:45:59Z"
70368a757e8d37b9f3dd96af4ca535275cb39580
477cc3fe597fd931c742700284016b937c778fe1
Add pyupgrade to pre-commit. ### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Hello This is sample data «küßî» “ЌύБЇ” 😀😉 😋
Hello This is sample data «küßî» “ЌύБЇ” 😀😉 😋
-1
TheAlgorithms/Python
5,638
Add pyupgrade to pre-commit
### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2021-10-28T14:37:17Z"
"2021-10-28T14:45:59Z"
70368a757e8d37b9f3dd96af4ca535275cb39580
477cc3fe597fd931c742700284016b937c778fe1
Add pyupgrade to pre-commit. ### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
#!/usr/bin/env python # # Sort large text files in a minimum amount of memory # import argparse import os class FileSplitter: BLOCK_FILENAME_FORMAT = "block_{0}.dat" def __init__(self, filename): self.filename = filename self.block_filenames = [] def write_block(self, data, block_number): filename = self.BLOCK_FILENAME_FORMAT.format(block_number) with open(filename, "w") as file: file.write(data) self.block_filenames.append(filename) def get_block_filenames(self): return self.block_filenames def split(self, block_size, sort_key=None): i = 0 with open(self.filename) as file: while True: lines = file.readlines(block_size) if lines == []: break if sort_key is None: lines.sort() else: lines.sort(key=sort_key) self.write_block("".join(lines), i) i += 1 def cleanup(self): map(lambda f: os.remove(f), self.block_filenames) class NWayMerge: def select(self, choices): min_index = -1 min_str = None for i in range(len(choices)): if min_str is None or choices[i] < min_str: min_index = i return min_index class FilesArray: def __init__(self, files): self.files = files self.empty = set() self.num_buffers = len(files) self.buffers = {i: None for i in range(self.num_buffers)} def get_dict(self): return { i: self.buffers[i] for i in range(self.num_buffers) if i not in self.empty } def refresh(self): for i in range(self.num_buffers): if self.buffers[i] is None and i not in self.empty: self.buffers[i] = self.files[i].readline() if self.buffers[i] == "": self.empty.add(i) self.files[i].close() if len(self.empty) == self.num_buffers: return False return True def unshift(self, index): value = self.buffers[index] self.buffers[index] = None return value class FileMerger: def __init__(self, merge_strategy): self.merge_strategy = merge_strategy def merge(self, filenames, outfilename, buffer_size): buffers = FilesArray(self.get_file_handles(filenames, buffer_size)) with open(outfilename, "w", buffer_size) as outfile: while buffers.refresh(): min_index = self.merge_strategy.select(buffers.get_dict()) outfile.write(buffers.unshift(min_index)) def get_file_handles(self, filenames, buffer_size): files = {} for i in range(len(filenames)): files[i] = open(filenames[i], "r", buffer_size) return files class ExternalSort: def __init__(self, block_size): self.block_size = block_size def sort(self, filename, sort_key=None): num_blocks = self.get_number_blocks(filename, self.block_size) splitter = FileSplitter(filename) splitter.split(self.block_size, sort_key) merger = FileMerger(NWayMerge()) buffer_size = self.block_size / (num_blocks + 1) merger.merge(splitter.get_block_filenames(), filename + ".out", buffer_size) splitter.cleanup() def get_number_blocks(self, filename, block_size): return (os.stat(filename).st_size / block_size) + 1 def parse_memory(string): if string[-1].lower() == "k": return int(string[:-1]) * 1024 elif string[-1].lower() == "m": return int(string[:-1]) * 1024 * 1024 elif string[-1].lower() == "g": return int(string[:-1]) * 1024 * 1024 * 1024 else: return int(string) def main(): parser = argparse.ArgumentParser() parser.add_argument( "-m", "--mem", help="amount of memory to use for sorting", default="100M" ) parser.add_argument( "filename", metavar="<filename>", nargs=1, help="name of file to sort" ) args = parser.parse_args() sorter = ExternalSort(parse_memory(args.mem)) sorter.sort(args.filename[0]) if __name__ == "__main__": main()
#!/usr/bin/env python # # Sort large text files in a minimum amount of memory # import argparse import os class FileSplitter: BLOCK_FILENAME_FORMAT = "block_{0}.dat" def __init__(self, filename): self.filename = filename self.block_filenames = [] def write_block(self, data, block_number): filename = self.BLOCK_FILENAME_FORMAT.format(block_number) with open(filename, "w") as file: file.write(data) self.block_filenames.append(filename) def get_block_filenames(self): return self.block_filenames def split(self, block_size, sort_key=None): i = 0 with open(self.filename) as file: while True: lines = file.readlines(block_size) if lines == []: break if sort_key is None: lines.sort() else: lines.sort(key=sort_key) self.write_block("".join(lines), i) i += 1 def cleanup(self): map(lambda f: os.remove(f), self.block_filenames) class NWayMerge: def select(self, choices): min_index = -1 min_str = None for i in range(len(choices)): if min_str is None or choices[i] < min_str: min_index = i return min_index class FilesArray: def __init__(self, files): self.files = files self.empty = set() self.num_buffers = len(files) self.buffers = {i: None for i in range(self.num_buffers)} def get_dict(self): return { i: self.buffers[i] for i in range(self.num_buffers) if i not in self.empty } def refresh(self): for i in range(self.num_buffers): if self.buffers[i] is None and i not in self.empty: self.buffers[i] = self.files[i].readline() if self.buffers[i] == "": self.empty.add(i) self.files[i].close() if len(self.empty) == self.num_buffers: return False return True def unshift(self, index): value = self.buffers[index] self.buffers[index] = None return value class FileMerger: def __init__(self, merge_strategy): self.merge_strategy = merge_strategy def merge(self, filenames, outfilename, buffer_size): buffers = FilesArray(self.get_file_handles(filenames, buffer_size)) with open(outfilename, "w", buffer_size) as outfile: while buffers.refresh(): min_index = self.merge_strategy.select(buffers.get_dict()) outfile.write(buffers.unshift(min_index)) def get_file_handles(self, filenames, buffer_size): files = {} for i in range(len(filenames)): files[i] = open(filenames[i], "r", buffer_size) return files class ExternalSort: def __init__(self, block_size): self.block_size = block_size def sort(self, filename, sort_key=None): num_blocks = self.get_number_blocks(filename, self.block_size) splitter = FileSplitter(filename) splitter.split(self.block_size, sort_key) merger = FileMerger(NWayMerge()) buffer_size = self.block_size / (num_blocks + 1) merger.merge(splitter.get_block_filenames(), filename + ".out", buffer_size) splitter.cleanup() def get_number_blocks(self, filename, block_size): return (os.stat(filename).st_size / block_size) + 1 def parse_memory(string): if string[-1].lower() == "k": return int(string[:-1]) * 1024 elif string[-1].lower() == "m": return int(string[:-1]) * 1024 * 1024 elif string[-1].lower() == "g": return int(string[:-1]) * 1024 * 1024 * 1024 else: return int(string) def main(): parser = argparse.ArgumentParser() parser.add_argument( "-m", "--mem", help="amount of memory to use for sorting", default="100M" ) parser.add_argument( "filename", metavar="<filename>", nargs=1, help="name of file to sort" ) args = parser.parse_args() sorter = ExternalSort(parse_memory(args.mem)) sorter.sort(args.filename[0]) if __name__ == "__main__": main()
-1
TheAlgorithms/Python
5,638
Add pyupgrade to pre-commit
### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2021-10-28T14:37:17Z"
"2021-10-28T14:45:59Z"
70368a757e8d37b9f3dd96af4ca535275cb39580
477cc3fe597fd931c742700284016b937c778fe1
Add pyupgrade to pre-commit. ### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
from __future__ import annotations class Node: def __init__(self, data=None): self.data = data self.next = None def __repr__(self): """Returns a visual representation of the node and all its following nodes.""" string_rep = [] temp = self while temp: string_rep.append(f"{temp.data}") temp = temp.next return "->".join(string_rep) def make_linked_list(elements_list: list): """Creates a Linked List from the elements of the given sequence (list/tuple) and returns the head of the Linked List. >>> make_linked_list([]) Traceback (most recent call last): ... Exception: The Elements List is empty >>> make_linked_list([7]) 7 >>> make_linked_list(['abc']) abc >>> make_linked_list([7, 25]) 7->25 """ if not elements_list: raise Exception("The Elements List is empty") current = head = Node(elements_list[0]) for i in range(1, len(elements_list)): current.next = Node(elements_list[i]) current = current.next return head def print_reverse(head_node: Node) -> None: """Prints the elements of the given Linked List in reverse order >>> print_reverse([]) >>> linked_list = make_linked_list([69, 88, 73]) >>> print_reverse(linked_list) 73 88 69 """ if head_node is not None and isinstance(head_node, Node): print_reverse(head_node.next) print(head_node.data) def main(): from doctest import testmod testmod() linked_list = make_linked_list([14, 52, 14, 12, 43]) print("Linked List:") print(linked_list) print("Elements in Reverse:") print_reverse(linked_list) if __name__ == "__main__": main()
from __future__ import annotations class Node: def __init__(self, data=None): self.data = data self.next = None def __repr__(self): """Returns a visual representation of the node and all its following nodes.""" string_rep = [] temp = self while temp: string_rep.append(f"{temp.data}") temp = temp.next return "->".join(string_rep) def make_linked_list(elements_list: list): """Creates a Linked List from the elements of the given sequence (list/tuple) and returns the head of the Linked List. >>> make_linked_list([]) Traceback (most recent call last): ... Exception: The Elements List is empty >>> make_linked_list([7]) 7 >>> make_linked_list(['abc']) abc >>> make_linked_list([7, 25]) 7->25 """ if not elements_list: raise Exception("The Elements List is empty") current = head = Node(elements_list[0]) for i in range(1, len(elements_list)): current.next = Node(elements_list[i]) current = current.next return head def print_reverse(head_node: Node) -> None: """Prints the elements of the given Linked List in reverse order >>> print_reverse([]) >>> linked_list = make_linked_list([69, 88, 73]) >>> print_reverse(linked_list) 73 88 69 """ if head_node is not None and isinstance(head_node, Node): print_reverse(head_node.next) print(head_node.data) def main(): from doctest import testmod testmod() linked_list = make_linked_list([14, 52, 14, 12, 43]) print("Linked List:") print(linked_list) print("Elements in Reverse:") print_reverse(linked_list) if __name__ == "__main__": main()
-1
TheAlgorithms/Python
5,638
Add pyupgrade to pre-commit
### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2021-10-28T14:37:17Z"
"2021-10-28T14:45:59Z"
70368a757e8d37b9f3dd96af4ca535275cb39580
477cc3fe597fd931c742700284016b937c778fe1
Add pyupgrade to pre-commit. ### Describe your change: Add https://github.com/asottile/pyupgrade to pre-commit * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Introspective Sort is hybrid sort (Quick Sort + Heap Sort + Insertion Sort) if the size of the list is under 16, use insertion sort https://en.wikipedia.org/wiki/Introsort """ import math def insertion_sort(array: list, start: int = 0, end: int = 0) -> list: """ >>> array = [4, 2, 6, 8, 1, 7, 8, 22, 14, 56, 27, 79, 23, 45, 14, 12] >>> insertion_sort(array, 0, len(array)) [1, 2, 4, 6, 7, 8, 8, 12, 14, 14, 22, 23, 27, 45, 56, 79] """ end = end or len(array) for i in range(start, end): temp_index = i temp_index_value = array[i] while temp_index != start and temp_index_value < array[temp_index - 1]: array[temp_index] = array[temp_index - 1] temp_index -= 1 array[temp_index] = temp_index_value return array def heapify(array: list, index: int, heap_size: int) -> None: # Max Heap """ >>> array = [4, 2, 6, 8, 1, 7, 8, 22, 14, 56, 27, 79, 23, 45, 14, 12] >>> heapify(array, len(array) // 2 ,len(array)) """ largest = index left_index = 2 * index + 1 # Left Node right_index = 2 * index + 2 # Right Node if left_index < heap_size and array[largest] < array[left_index]: largest = left_index if right_index < heap_size and array[largest] < array[right_index]: largest = right_index if largest != index: array[index], array[largest] = array[largest], array[index] heapify(array, largest, heap_size) def heap_sort(array: list) -> list: """ >>> array = [4, 2, 6, 8, 1, 7, 8, 22, 14, 56, 27, 79, 23, 45, 14, 12] >>> heap_sort(array) [1, 2, 4, 6, 7, 8, 8, 12, 14, 14, 22, 23, 27, 45, 56, 79] """ n = len(array) for i in range(n // 2, -1, -1): heapify(array, i, n) for i in range(n - 1, 0, -1): array[i], array[0] = array[0], array[i] heapify(array, 0, i) return array def median_of_3( array: list, first_index: int, middle_index: int, last_index: int ) -> int: """ >>> array = [4, 2, 6, 8, 1, 7, 8, 22, 14, 56, 27, 79, 23, 45, 14, 12] >>> median_of_3(array, 0, 0 + ((len(array) - 0) // 2) + 1, len(array) - 1) 12 """ if (array[first_index] > array[middle_index]) != ( array[first_index] > array[last_index] ): return array[first_index] elif (array[middle_index] > array[first_index]) != ( array[middle_index] > array[last_index] ): return array[middle_index] else: return array[last_index] def partition(array: list, low: int, high: int, pivot: int) -> int: """ >>> array = [4, 2, 6, 8, 1, 7, 8, 22, 14, 56, 27, 79, 23, 45, 14, 12] >>> partition(array, 0, len(array), 12) 8 """ i = low j = high while True: while array[i] < pivot: i += 1 j -= 1 while pivot < array[j]: j -= 1 if i >= j: return i array[i], array[j] = array[j], array[i] i += 1 def sort(array: list) -> list: """ :param collection: some mutable ordered collection with heterogeneous comparable items inside :return: the same collection ordered by ascending Examples: >>> sort([4, 2, 6, 8, 1, 7, 8, 22, 14, 56, 27, 79, 23, 45, 14, 12]) [1, 2, 4, 6, 7, 8, 8, 12, 14, 14, 22, 23, 27, 45, 56, 79] >>> sort([-1, -5, -3, -13, -44]) [-44, -13, -5, -3, -1] >>> sort([]) [] >>> sort([5]) [5] >>> sort([-3, 0, -7, 6, 23, -34]) [-34, -7, -3, 0, 6, 23] >>> sort([1.7, 1.0, 3.3, 2.1, 0.3 ]) [0.3, 1.0, 1.7, 2.1, 3.3] >>> sort(['d', 'a', 'b', 'e', 'c']) ['a', 'b', 'c', 'd', 'e'] """ if len(array) == 0: return array max_depth = 2 * math.ceil(math.log2(len(array))) size_threshold = 16 return intro_sort(array, 0, len(array), size_threshold, max_depth) def intro_sort( array: list, start: int, end: int, size_threshold: int, max_depth: int ) -> list: """ >>> array = [4, 2, 6, 8, 1, 7, 8, 22, 14, 56, 27, 79, 23, 45, 14, 12] >>> max_depth = 2 * math.ceil(math.log2(len(array))) >>> intro_sort(array, 0, 
len(array), 16, max_depth) [1, 2, 4, 6, 7, 8, 8, 12, 14, 14, 22, 23, 27, 45, 56, 79] """ while end - start > size_threshold: if max_depth == 0: return heap_sort(array) max_depth -= 1 pivot = median_of_3(array, start, start + ((end - start) // 2) + 1, end - 1) p = partition(array, start, end, pivot) intro_sort(array, p, end, size_threshold, max_depth) end = p return insertion_sort(array, start, end) if __name__ == "__main__": import doctest doctest.testmod() user_input = input("Enter numbers separated by a comma : ").strip() unsorted = [float(item) for item in user_input.split(",")] print(sort(unsorted))
""" Introspective Sort is hybrid sort (Quick Sort + Heap Sort + Insertion Sort) if the size of the list is under 16, use insertion sort https://en.wikipedia.org/wiki/Introsort """ import math def insertion_sort(array: list, start: int = 0, end: int = 0) -> list: """ >>> array = [4, 2, 6, 8, 1, 7, 8, 22, 14, 56, 27, 79, 23, 45, 14, 12] >>> insertion_sort(array, 0, len(array)) [1, 2, 4, 6, 7, 8, 8, 12, 14, 14, 22, 23, 27, 45, 56, 79] """ end = end or len(array) for i in range(start, end): temp_index = i temp_index_value = array[i] while temp_index != start and temp_index_value < array[temp_index - 1]: array[temp_index] = array[temp_index - 1] temp_index -= 1 array[temp_index] = temp_index_value return array def heapify(array: list, index: int, heap_size: int) -> None: # Max Heap """ >>> array = [4, 2, 6, 8, 1, 7, 8, 22, 14, 56, 27, 79, 23, 45, 14, 12] >>> heapify(array, len(array) // 2 ,len(array)) """ largest = index left_index = 2 * index + 1 # Left Node right_index = 2 * index + 2 # Right Node if left_index < heap_size and array[largest] < array[left_index]: largest = left_index if right_index < heap_size and array[largest] < array[right_index]: largest = right_index if largest != index: array[index], array[largest] = array[largest], array[index] heapify(array, largest, heap_size) def heap_sort(array: list) -> list: """ >>> array = [4, 2, 6, 8, 1, 7, 8, 22, 14, 56, 27, 79, 23, 45, 14, 12] >>> heap_sort(array) [1, 2, 4, 6, 7, 8, 8, 12, 14, 14, 22, 23, 27, 45, 56, 79] """ n = len(array) for i in range(n // 2, -1, -1): heapify(array, i, n) for i in range(n - 1, 0, -1): array[i], array[0] = array[0], array[i] heapify(array, 0, i) return array def median_of_3( array: list, first_index: int, middle_index: int, last_index: int ) -> int: """ >>> array = [4, 2, 6, 8, 1, 7, 8, 22, 14, 56, 27, 79, 23, 45, 14, 12] >>> median_of_3(array, 0, 0 + ((len(array) - 0) // 2) + 1, len(array) - 1) 12 """ if (array[first_index] > array[middle_index]) != ( array[first_index] > array[last_index] ): return array[first_index] elif (array[middle_index] > array[first_index]) != ( array[middle_index] > array[last_index] ): return array[middle_index] else: return array[last_index] def partition(array: list, low: int, high: int, pivot: int) -> int: """ >>> array = [4, 2, 6, 8, 1, 7, 8, 22, 14, 56, 27, 79, 23, 45, 14, 12] >>> partition(array, 0, len(array), 12) 8 """ i = low j = high while True: while array[i] < pivot: i += 1 j -= 1 while pivot < array[j]: j -= 1 if i >= j: return i array[i], array[j] = array[j], array[i] i += 1 def sort(array: list) -> list: """ :param collection: some mutable ordered collection with heterogeneous comparable items inside :return: the same collection ordered by ascending Examples: >>> sort([4, 2, 6, 8, 1, 7, 8, 22, 14, 56, 27, 79, 23, 45, 14, 12]) [1, 2, 4, 6, 7, 8, 8, 12, 14, 14, 22, 23, 27, 45, 56, 79] >>> sort([-1, -5, -3, -13, -44]) [-44, -13, -5, -3, -1] >>> sort([]) [] >>> sort([5]) [5] >>> sort([-3, 0, -7, 6, 23, -34]) [-34, -7, -3, 0, 6, 23] >>> sort([1.7, 1.0, 3.3, 2.1, 0.3 ]) [0.3, 1.0, 1.7, 2.1, 3.3] >>> sort(['d', 'a', 'b', 'e', 'c']) ['a', 'b', 'c', 'd', 'e'] """ if len(array) == 0: return array max_depth = 2 * math.ceil(math.log2(len(array))) size_threshold = 16 return intro_sort(array, 0, len(array), size_threshold, max_depth) def intro_sort( array: list, start: int, end: int, size_threshold: int, max_depth: int ) -> list: """ >>> array = [4, 2, 6, 8, 1, 7, 8, 22, 14, 56, 27, 79, 23, 45, 14, 12] >>> max_depth = 2 * math.ceil(math.log2(len(array))) >>> intro_sort(array, 0, 
len(array), 16, max_depth) [1, 2, 4, 6, 7, 8, 8, 12, 14, 14, 22, 23, 27, 45, 56, 79] """ while end - start > size_threshold: if max_depth == 0: return heap_sort(array) max_depth -= 1 pivot = median_of_3(array, start, start + ((end - start) // 2) + 1, end - 1) p = partition(array, start, end, pivot) intro_sort(array, p, end, size_threshold, max_depth) end = p return insertion_sort(array, start, end) if __name__ == "__main__": import doctest doctest.testmod() user_input = input("Enter numbers separated by a comma : ").strip() unsorted = [float(item) for item in user_input.split(",")] print(sort(unsorted))
-1
TheAlgorithms/Python
5,570
[mypy] annotate `compression`
### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
ErwinJunge
"2021-10-23T19:54:11Z"
"2021-10-26T10:29:28Z"
de07245c170f2007cef415fa4114be870078988e
e49d8e3af427353c18fe5f1afb0927e3e8d6461c
[mypy] annotate `compression`. ### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" https://en.wikipedia.org/wiki/Burrows%E2%80%93Wheeler_transform The Burrows–Wheeler transform (BWT, also called block-sorting compression) rearranges a character string into runs of similar characters. This is useful for compression, since it tends to be easy to compress a string that has runs of repeated characters by techniques such as move-to-front transform and run-length encoding. More importantly, the transformation is reversible, without needing to store any additional data except the position of the first original character. The BWT is thus a "free" method of improving the efficiency of text compression algorithms, costing only some extra computation. """ from __future__ import annotations def all_rotations(s: str) -> list[str]: """ :param s: The string that will be rotated len(s) times. :return: A list with the rotations. :raises TypeError: If s is not an instance of str. Examples: >>> all_rotations("^BANANA|") # doctest: +NORMALIZE_WHITESPACE ['^BANANA|', 'BANANA|^', 'ANANA|^B', 'NANA|^BA', 'ANA|^BAN', 'NA|^BANA', 'A|^BANAN', '|^BANANA'] >>> all_rotations("a_asa_da_casa") # doctest: +NORMALIZE_WHITESPACE ['a_asa_da_casa', '_asa_da_casaa', 'asa_da_casaa_', 'sa_da_casaa_a', 'a_da_casaa_as', '_da_casaa_asa', 'da_casaa_asa_', 'a_casaa_asa_d', '_casaa_asa_da', 'casaa_asa_da_', 'asaa_asa_da_c', 'saa_asa_da_ca', 'aa_asa_da_cas'] >>> all_rotations("panamabanana") # doctest: +NORMALIZE_WHITESPACE ['panamabanana', 'anamabananap', 'namabananapa', 'amabananapan', 'mabananapana', 'abananapanam', 'bananapanama', 'ananapanamab', 'nanapanamaba', 'anapanamaban', 'napanamabana', 'apanamabanan'] >>> all_rotations(5) Traceback (most recent call last): ... TypeError: The parameter s type must be str. """ if not isinstance(s, str): raise TypeError("The parameter s type must be str.") return [s[i:] + s[:i] for i in range(len(s))] def bwt_transform(s: str) -> dict: """ :param s: The string that will be used at bwt algorithm :return: the string composed of the last char of each row of the ordered rotations and the index of the original string at ordered rotations list :raises TypeError: If the s parameter type is not str :raises ValueError: If the s parameter is empty Examples: >>> bwt_transform("^BANANA") {'bwt_string': 'BNN^AAA', 'idx_original_string': 6} >>> bwt_transform("a_asa_da_casa") {'bwt_string': 'aaaadss_c__aa', 'idx_original_string': 3} >>> bwt_transform("panamabanana") {'bwt_string': 'mnpbnnaaaaaa', 'idx_original_string': 11} >>> bwt_transform(4) Traceback (most recent call last): ... TypeError: The parameter s type must be str. >>> bwt_transform('') Traceback (most recent call last): ... ValueError: The parameter s must not be empty. 
""" if not isinstance(s, str): raise TypeError("The parameter s type must be str.") if not s: raise ValueError("The parameter s must not be empty.") rotations = all_rotations(s) rotations.sort() # sort the list of rotations in alphabetically order # make a string composed of the last char of each rotation return { "bwt_string": "".join([word[-1] for word in rotations]), "idx_original_string": rotations.index(s), } def reverse_bwt(bwt_string: str, idx_original_string: int) -> str: """ :param bwt_string: The string returned from bwt algorithm execution :param idx_original_string: A 0-based index of the string that was used to generate bwt_string at ordered rotations list :return: The string used to generate bwt_string when bwt was executed :raises TypeError: If the bwt_string parameter type is not str :raises ValueError: If the bwt_string parameter is empty :raises TypeError: If the idx_original_string type is not int or if not possible to cast it to int :raises ValueError: If the idx_original_string value is lower than 0 or greater than len(bwt_string) - 1 >>> reverse_bwt("BNN^AAA", 6) '^BANANA' >>> reverse_bwt("aaaadss_c__aa", 3) 'a_asa_da_casa' >>> reverse_bwt("mnpbnnaaaaaa", 11) 'panamabanana' >>> reverse_bwt(4, 11) Traceback (most recent call last): ... TypeError: The parameter bwt_string type must be str. >>> reverse_bwt("", 11) Traceback (most recent call last): ... ValueError: The parameter bwt_string must not be empty. >>> reverse_bwt("mnpbnnaaaaaa", "asd") # doctest: +NORMALIZE_WHITESPACE Traceback (most recent call last): ... TypeError: The parameter idx_original_string type must be int or passive of cast to int. >>> reverse_bwt("mnpbnnaaaaaa", -1) Traceback (most recent call last): ... ValueError: The parameter idx_original_string must not be lower than 0. >>> reverse_bwt("mnpbnnaaaaaa", 12) # doctest: +NORMALIZE_WHITESPACE Traceback (most recent call last): ... ValueError: The parameter idx_original_string must be lower than len(bwt_string). >>> reverse_bwt("mnpbnnaaaaaa", 11.0) 'panamabanana' >>> reverse_bwt("mnpbnnaaaaaa", 11.4) 'panamabanana' """ if not isinstance(bwt_string, str): raise TypeError("The parameter bwt_string type must be str.") if not bwt_string: raise ValueError("The parameter bwt_string must not be empty.") try: idx_original_string = int(idx_original_string) except ValueError: raise TypeError( "The parameter idx_original_string type must be int or passive" " of cast to int." ) if idx_original_string < 0: raise ValueError("The parameter idx_original_string must not be lower than 0.") if idx_original_string >= len(bwt_string): raise ValueError( "The parameter idx_original_string must be lower than" " len(bwt_string)." ) ordered_rotations = [""] * len(bwt_string) for x in range(len(bwt_string)): for i in range(len(bwt_string)): ordered_rotations[i] = bwt_string[i] + ordered_rotations[i] ordered_rotations.sort() return ordered_rotations[idx_original_string] if __name__ == "__main__": entry_msg = "Provide a string that I will generate its BWT transform: " s = input(entry_msg).strip() result = bwt_transform(s) print( f"Burrows Wheeler transform for string '{s}' results " f"in '{result['bwt_string']}'" ) original_string = reverse_bwt(result["bwt_string"], result["idx_original_string"]) print( f"Reversing Burrows Wheeler transform for entry '{result['bwt_string']}' " f"we get original string '{original_string}'" )
""" https://en.wikipedia.org/wiki/Burrows%E2%80%93Wheeler_transform The Burrows–Wheeler transform (BWT, also called block-sorting compression) rearranges a character string into runs of similar characters. This is useful for compression, since it tends to be easy to compress a string that has runs of repeated characters by techniques such as move-to-front transform and run-length encoding. More importantly, the transformation is reversible, without needing to store any additional data except the position of the first original character. The BWT is thus a "free" method of improving the efficiency of text compression algorithms, costing only some extra computation. """ from __future__ import annotations from typing import TypedDict class BWTTransformDict(TypedDict): bwt_string: str idx_original_string: int def all_rotations(s: str) -> list[str]: """ :param s: The string that will be rotated len(s) times. :return: A list with the rotations. :raises TypeError: If s is not an instance of str. Examples: >>> all_rotations("^BANANA|") # doctest: +NORMALIZE_WHITESPACE ['^BANANA|', 'BANANA|^', 'ANANA|^B', 'NANA|^BA', 'ANA|^BAN', 'NA|^BANA', 'A|^BANAN', '|^BANANA'] >>> all_rotations("a_asa_da_casa") # doctest: +NORMALIZE_WHITESPACE ['a_asa_da_casa', '_asa_da_casaa', 'asa_da_casaa_', 'sa_da_casaa_a', 'a_da_casaa_as', '_da_casaa_asa', 'da_casaa_asa_', 'a_casaa_asa_d', '_casaa_asa_da', 'casaa_asa_da_', 'asaa_asa_da_c', 'saa_asa_da_ca', 'aa_asa_da_cas'] >>> all_rotations("panamabanana") # doctest: +NORMALIZE_WHITESPACE ['panamabanana', 'anamabananap', 'namabananapa', 'amabananapan', 'mabananapana', 'abananapanam', 'bananapanama', 'ananapanamab', 'nanapanamaba', 'anapanamaban', 'napanamabana', 'apanamabanan'] >>> all_rotations(5) Traceback (most recent call last): ... TypeError: The parameter s type must be str. """ if not isinstance(s, str): raise TypeError("The parameter s type must be str.") return [s[i:] + s[:i] for i in range(len(s))] def bwt_transform(s: str) -> BWTTransformDict: """ :param s: The string that will be used at bwt algorithm :return: the string composed of the last char of each row of the ordered rotations and the index of the original string at ordered rotations list :raises TypeError: If the s parameter type is not str :raises ValueError: If the s parameter is empty Examples: >>> bwt_transform("^BANANA") {'bwt_string': 'BNN^AAA', 'idx_original_string': 6} >>> bwt_transform("a_asa_da_casa") {'bwt_string': 'aaaadss_c__aa', 'idx_original_string': 3} >>> bwt_transform("panamabanana") {'bwt_string': 'mnpbnnaaaaaa', 'idx_original_string': 11} >>> bwt_transform(4) Traceback (most recent call last): ... TypeError: The parameter s type must be str. >>> bwt_transform('') Traceback (most recent call last): ... ValueError: The parameter s must not be empty. 
""" if not isinstance(s, str): raise TypeError("The parameter s type must be str.") if not s: raise ValueError("The parameter s must not be empty.") rotations = all_rotations(s) rotations.sort() # sort the list of rotations in alphabetically order # make a string composed of the last char of each rotation response: BWTTransformDict = { "bwt_string": "".join([word[-1] for word in rotations]), "idx_original_string": rotations.index(s), } return response def reverse_bwt(bwt_string: str, idx_original_string: int) -> str: """ :param bwt_string: The string returned from bwt algorithm execution :param idx_original_string: A 0-based index of the string that was used to generate bwt_string at ordered rotations list :return: The string used to generate bwt_string when bwt was executed :raises TypeError: If the bwt_string parameter type is not str :raises ValueError: If the bwt_string parameter is empty :raises TypeError: If the idx_original_string type is not int or if not possible to cast it to int :raises ValueError: If the idx_original_string value is lower than 0 or greater than len(bwt_string) - 1 >>> reverse_bwt("BNN^AAA", 6) '^BANANA' >>> reverse_bwt("aaaadss_c__aa", 3) 'a_asa_da_casa' >>> reverse_bwt("mnpbnnaaaaaa", 11) 'panamabanana' >>> reverse_bwt(4, 11) Traceback (most recent call last): ... TypeError: The parameter bwt_string type must be str. >>> reverse_bwt("", 11) Traceback (most recent call last): ... ValueError: The parameter bwt_string must not be empty. >>> reverse_bwt("mnpbnnaaaaaa", "asd") # doctest: +NORMALIZE_WHITESPACE Traceback (most recent call last): ... TypeError: The parameter idx_original_string type must be int or passive of cast to int. >>> reverse_bwt("mnpbnnaaaaaa", -1) Traceback (most recent call last): ... ValueError: The parameter idx_original_string must not be lower than 0. >>> reverse_bwt("mnpbnnaaaaaa", 12) # doctest: +NORMALIZE_WHITESPACE Traceback (most recent call last): ... ValueError: The parameter idx_original_string must be lower than len(bwt_string). >>> reverse_bwt("mnpbnnaaaaaa", 11.0) 'panamabanana' >>> reverse_bwt("mnpbnnaaaaaa", 11.4) 'panamabanana' """ if not isinstance(bwt_string, str): raise TypeError("The parameter bwt_string type must be str.") if not bwt_string: raise ValueError("The parameter bwt_string must not be empty.") try: idx_original_string = int(idx_original_string) except ValueError: raise TypeError( "The parameter idx_original_string type must be int or passive" " of cast to int." ) if idx_original_string < 0: raise ValueError("The parameter idx_original_string must not be lower than 0.") if idx_original_string >= len(bwt_string): raise ValueError( "The parameter idx_original_string must be lower than" " len(bwt_string)." ) ordered_rotations = [""] * len(bwt_string) for x in range(len(bwt_string)): for i in range(len(bwt_string)): ordered_rotations[i] = bwt_string[i] + ordered_rotations[i] ordered_rotations.sort() return ordered_rotations[idx_original_string] if __name__ == "__main__": entry_msg = "Provide a string that I will generate its BWT transform: " s = input(entry_msg).strip() result = bwt_transform(s) print( f"Burrows Wheeler transform for string '{s}' results " f"in '{result['bwt_string']}'" ) original_string = reverse_bwt(result["bwt_string"], result["idx_original_string"]) print( f"Reversing Burrows Wheeler transform for entry '{result['bwt_string']}' " f"we get original string '{original_string}'" )
1
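A minimal, self-contained sketch of the Burrows–Wheeler round trip described in the record above. It is independent of the record's helper functions (the record reads better through `bwt_transform`/`reverse_bwt`); the sample string and expected values come from its doctests.

```python
from __future__ import annotations


def bwt(s: str) -> tuple[str, int]:
    # Sort every rotation of s; the transform is the last column plus the
    # position of the original string among the sorted rotations.
    rotations = sorted(s[i:] + s[:i] for i in range(len(s)))
    return "".join(rotation[-1] for rotation in rotations), rotations.index(s)


def inverse_bwt(last_column: str, idx: int) -> str:
    # Repeatedly prepend the last column and re-sort to rebuild the sorted rotations.
    table = [""] * len(last_column)
    for _ in range(len(last_column)):
        table = sorted(last_column[i] + table[i] for i in range(len(last_column)))
    return table[idx]


transformed, idx = bwt("^BANANA")
print(transformed, idx)               # BNN^AAA 6
print(inverse_bwt(transformed, idx))  # ^BANANA
```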
TheAlgorithms/Python
5,570
[mypy] annotate `compression`
### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
ErwinJunge
"2021-10-23T19:54:11Z"
"2021-10-26T10:29:28Z"
de07245c170f2007cef415fa4114be870078988e
e49d8e3af427353c18fe5f1afb0927e3e8d6461c
[mypy] annotate `compression`. ### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
import sys class Letter: def __init__(self, letter, freq): self.letter = letter self.freq = freq self.bitstring = {} def __repr__(self): return f"{self.letter}:{self.freq}" class TreeNode: def __init__(self, freq, left, right): self.freq = freq self.left = left self.right = right def parse_file(file_path): """ Read the file and build a dict of all letters and their frequencies, then convert the dict into a list of Letters. """ chars = {} with open(file_path) as f: while True: c = f.read(1) if not c: break chars[c] = chars[c] + 1 if c in chars.keys() else 1 return sorted((Letter(c, f) for c, f in chars.items()), key=lambda l: l.freq) def build_tree(letters): """ Run through the list of Letters and build the min heap for the Huffman Tree. """ while len(letters) > 1: left = letters.pop(0) right = letters.pop(0) total_freq = left.freq + right.freq node = TreeNode(total_freq, left, right) letters.append(node) letters.sort(key=lambda l: l.freq) return letters[0] def traverse_tree(root, bitstring): """ Recursively traverse the Huffman Tree to set each Letter's bitstring dictionary, and return the list of Letters """ if type(root) is Letter: root.bitstring[root.letter] = bitstring return [root] letters = [] letters += traverse_tree(root.left, bitstring + "0") letters += traverse_tree(root.right, bitstring + "1") return letters def huffman(file_path): """ Parse the file, build the tree, then run through the file again, using the letters dictionary to find and print out the bitstring for each letter. """ letters_list = parse_file(file_path) root = build_tree(letters_list) letters = { k: v for letter in traverse_tree(root, "") for k, v in letter.bitstring.items() } print(f"Huffman Coding of {file_path}: ") with open(file_path) as f: while True: c = f.read(1) if not c: break print(letters[c], end=" ") print() if __name__ == "__main__": # pass the file path to the huffman function huffman(sys.argv[1])
from __future__ import annotations import sys class Letter: def __init__(self, letter: str, freq: int): self.letter: str = letter self.freq: int = freq self.bitstring: dict[str, str] = {} def __repr__(self) -> str: return f"{self.letter}:{self.freq}" class TreeNode: def __init__(self, freq: int, left: Letter | TreeNode, right: Letter | TreeNode): self.freq: int = freq self.left: Letter | TreeNode = left self.right: Letter | TreeNode = right def parse_file(file_path: str) -> list[Letter]: """ Read the file and build a dict of all letters and their frequencies, then convert the dict into a list of Letters. """ chars: dict[str, int] = {} with open(file_path) as f: while True: c = f.read(1) if not c: break chars[c] = chars[c] + 1 if c in chars.keys() else 1 return sorted((Letter(c, f) for c, f in chars.items()), key=lambda l: l.freq) def build_tree(letters: list[Letter]) -> Letter | TreeNode: """ Run through the list of Letters and build the min heap for the Huffman Tree. """ response: list[Letter | TreeNode] = letters # type: ignore while len(response) > 1: left = response.pop(0) right = response.pop(0) total_freq = left.freq + right.freq node = TreeNode(total_freq, left, right) response.append(node) response.sort(key=lambda l: l.freq) return response[0] def traverse_tree(root: Letter | TreeNode, bitstring: str) -> list[Letter]: """ Recursively traverse the Huffman Tree to set each Letter's bitstring dictionary, and return the list of Letters """ if type(root) is Letter: root.bitstring[root.letter] = bitstring return [root] treenode: TreeNode = root # type: ignore letters = [] letters += traverse_tree(treenode.left, bitstring + "0") letters += traverse_tree(treenode.right, bitstring + "1") return letters def huffman(file_path: str) -> None: """ Parse the file, build the tree, then run through the file again, using the letters dictionary to find and print out the bitstring for each letter. """ letters_list = parse_file(file_path) root = build_tree(letters_list) letters = { k: v for letter in traverse_tree(root, "") for k, v in letter.bitstring.items() } print(f"Huffman Coding of {file_path}: ") with open(file_path) as f: while True: c = f.read(1) if not c: break print(letters[c], end=" ") print() if __name__ == "__main__": # pass the file path to the huffman function huffman(sys.argv[1])
1
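For the Huffman record above, a compact heap-based sketch of the same code-assignment idea, working on an in-memory string instead of a file. This is an independent rewrite for illustration, not the repository's implementation; it assumes only the standard library.

```python
from __future__ import annotations

import heapq
from collections import Counter


def huffman_codes(text: str) -> dict[str, str]:
    # Heap entries are (frequency, tie_breaker, {symbol: partial_code}).
    heap = [
        (freq, i, {symbol: ""})
        for i, (symbol, freq) in enumerate(Counter(text).items())
    ]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        freq_a, _, codes_a = heapq.heappop(heap)
        freq_b, _, codes_b = heapq.heappop(heap)
        # Merging two subtrees prepends a 0-bit to one side and a 1-bit to the other.
        merged = {symbol: "0" + code for symbol, code in codes_a.items()}
        merged.update({symbol: "1" + code for symbol, code in codes_b.items()})
        heapq.heappush(heap, (freq_a + freq_b, tie, merged))
        tie += 1
    return heap[0][2]


print(huffman_codes("abracadabra"))
```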
TheAlgorithms/Python
5,570
[mypy] annotate `compression`
### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
ErwinJunge
"2021-10-23T19:54:11Z"
"2021-10-26T10:29:28Z"
de07245c170f2007cef415fa4114be870078988e
e49d8e3af427353c18fe5f1afb0927e3e8d6461c
[mypy] annotate `compression`. ### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" One of the several implementations of Lempel–Ziv–Welch compression algorithm https://en.wikipedia.org/wiki/Lempel%E2%80%93Ziv%E2%80%93Welch """ import math import os import sys def read_file_binary(file_path: str) -> str: """ Reads given file as bytes and returns them as a long string """ result = "" try: with open(file_path, "rb") as binary_file: data = binary_file.read() for dat in data: curr_byte = f"{dat:08b}" result += curr_byte return result except OSError: print("File not accessible") sys.exit() def add_key_to_lexicon( lexicon: dict, curr_string: str, index: int, last_match_id: str ) -> None: """ Adds new strings (curr_string + "0", curr_string + "1") to the lexicon """ lexicon.pop(curr_string) lexicon[curr_string + "0"] = last_match_id if math.log2(index).is_integer(): for curr_key in lexicon: lexicon[curr_key] = "0" + lexicon[curr_key] lexicon[curr_string + "1"] = bin(index)[2:] def compress_data(data_bits: str) -> str: """ Compresses given data_bits using Lempel–Ziv–Welch compression algorithm and returns the result as a string """ lexicon = {"0": "0", "1": "1"} result, curr_string = "", "" index = len(lexicon) for i in range(len(data_bits)): curr_string += data_bits[i] if curr_string not in lexicon: continue last_match_id = lexicon[curr_string] result += last_match_id add_key_to_lexicon(lexicon, curr_string, index, last_match_id) index += 1 curr_string = "" while curr_string != "" and curr_string not in lexicon: curr_string += "0" if curr_string != "": last_match_id = lexicon[curr_string] result += last_match_id return result def add_file_length(source_path: str, compressed: str) -> str: """ Adds given file's length in front (using Elias gamma coding) of the compressed string """ file_length = os.path.getsize(source_path) file_length_binary = bin(file_length)[2:] length_length = len(file_length_binary) return "0" * (length_length - 1) + file_length_binary + compressed def write_file_binary(file_path: str, to_write: str) -> None: """ Writes given to_write string (should only consist of 0's and 1's) as bytes in the file """ byte_length = 8 try: with open(file_path, "wb") as opened_file: result_byte_array = [ to_write[i : i + byte_length] for i in range(0, len(to_write), byte_length) ] if len(result_byte_array[-1]) % byte_length == 0: result_byte_array.append("10000000") else: result_byte_array[-1] += "1" + "0" * ( byte_length - len(result_byte_array[-1]) - 1 ) for elem in result_byte_array: opened_file.write(int(elem, 2).to_bytes(1, byteorder="big")) except OSError: print("File not accessible") sys.exit() def compress(source_path, destination_path: str) -> None: """ Reads source file, compresses it and writes the compressed result in destination file """ data_bits = read_file_binary(source_path) compressed = compress_data(data_bits) compressed = add_file_length(source_path, compressed) write_file_binary(destination_path, compressed) if __name__ == "__main__": compress(sys.argv[1], sys.argv[2])
""" One of the several implementations of Lempel–Ziv–Welch compression algorithm https://en.wikipedia.org/wiki/Lempel%E2%80%93Ziv%E2%80%93Welch """ import math import os import sys def read_file_binary(file_path: str) -> str: """ Reads given file as bytes and returns them as a long string """ result = "" try: with open(file_path, "rb") as binary_file: data = binary_file.read() for dat in data: curr_byte = f"{dat:08b}" result += curr_byte return result except OSError: print("File not accessible") sys.exit() def add_key_to_lexicon( lexicon: dict[str, str], curr_string: str, index: int, last_match_id: str ) -> None: """ Adds new strings (curr_string + "0", curr_string + "1") to the lexicon """ lexicon.pop(curr_string) lexicon[curr_string + "0"] = last_match_id if math.log2(index).is_integer(): for curr_key in lexicon: lexicon[curr_key] = "0" + lexicon[curr_key] lexicon[curr_string + "1"] = bin(index)[2:] def compress_data(data_bits: str) -> str: """ Compresses given data_bits using Lempel–Ziv–Welch compression algorithm and returns the result as a string """ lexicon = {"0": "0", "1": "1"} result, curr_string = "", "" index = len(lexicon) for i in range(len(data_bits)): curr_string += data_bits[i] if curr_string not in lexicon: continue last_match_id = lexicon[curr_string] result += last_match_id add_key_to_lexicon(lexicon, curr_string, index, last_match_id) index += 1 curr_string = "" while curr_string != "" and curr_string not in lexicon: curr_string += "0" if curr_string != "": last_match_id = lexicon[curr_string] result += last_match_id return result def add_file_length(source_path: str, compressed: str) -> str: """ Adds given file's length in front (using Elias gamma coding) of the compressed string """ file_length = os.path.getsize(source_path) file_length_binary = bin(file_length)[2:] length_length = len(file_length_binary) return "0" * (length_length - 1) + file_length_binary + compressed def write_file_binary(file_path: str, to_write: str) -> None: """ Writes given to_write string (should only consist of 0's and 1's) as bytes in the file """ byte_length = 8 try: with open(file_path, "wb") as opened_file: result_byte_array = [ to_write[i : i + byte_length] for i in range(0, len(to_write), byte_length) ] if len(result_byte_array[-1]) % byte_length == 0: result_byte_array.append("10000000") else: result_byte_array[-1] += "1" + "0" * ( byte_length - len(result_byte_array[-1]) - 1 ) for elem in result_byte_array: opened_file.write(int(elem, 2).to_bytes(1, byteorder="big")) except OSError: print("File not accessible") sys.exit() def compress(source_path: str, destination_path: str) -> None: """ Reads source file, compresses it and writes the compressed result in destination file """ data_bits = read_file_binary(source_path) compressed = compress_data(data_bits) compressed = add_file_length(source_path, compressed) write_file_binary(destination_path, compressed) if __name__ == "__main__": compress(sys.argv[1], sys.argv[2])
1
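The LZW record above operates on a bit string read from a file; as a point of comparison, here is a minimal sketch of the classic character-level variant of the same dictionary-growing idea, with no file handling. It is a textbook illustration, not the record's algorithm.

```python
def lzw_compress(text: str) -> list[int]:
    # Dictionary starts with all single bytes; longer phrases are added as they appear.
    dictionary = {chr(i): i for i in range(256)}
    current, output = "", []
    for char in text:
        candidate = current + char
        if candidate in dictionary:
            current = candidate
        else:
            output.append(dictionary[current])
            dictionary[candidate] = len(dictionary)
            current = char
    if current:
        output.append(dictionary[current])
    return output


print(lzw_compress("TOBEORNOTTOBEORTOBEORNOT"))
```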
TheAlgorithms/Python
5,570
[mypy] annotate `compression`
### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
ErwinJunge
"2021-10-23T19:54:11Z"
"2021-10-26T10:29:28Z"
de07245c170f2007cef415fa4114be870078988e
e49d8e3af427353c18fe5f1afb0927e3e8d6461c
[mypy] annotate `compression`. ### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Peak signal-to-noise ratio - PSNR https://en.wikipedia.org/wiki/Peak_signal-to-noise_ratio Source: https://tutorials.techonical.com/how-to-calculate-psnr-value-of-two-images-using-python """ import math import os import cv2 import numpy as np def psnr(original, contrast): mse = np.mean((original - contrast) ** 2) if mse == 0: return 100 PIXEL_MAX = 255.0 PSNR = 20 * math.log10(PIXEL_MAX / math.sqrt(mse)) return PSNR def main(): dir_path = os.path.dirname(os.path.realpath(__file__)) # Loading images (original image and compressed image) original = cv2.imread(os.path.join(dir_path, "image_data/original_image.png")) contrast = cv2.imread(os.path.join(dir_path, "image_data/compressed_image.png"), 1) original2 = cv2.imread(os.path.join(dir_path, "image_data/PSNR-example-base.png")) contrast2 = cv2.imread( os.path.join(dir_path, "image_data/PSNR-example-comp-10.jpg"), 1 ) # Value expected: 29.73dB print("-- First Test --") print(f"PSNR value is {psnr(original, contrast)} dB") # # Value expected: 31.53dB (Wikipedia Example) print("\n-- Second Test --") print(f"PSNR value is {psnr(original2, contrast2)} dB") if __name__ == "__main__": main()
""" Peak signal-to-noise ratio - PSNR https://en.wikipedia.org/wiki/Peak_signal-to-noise_ratio Source: https://tutorials.techonical.com/how-to-calculate-psnr-value-of-two-images-using-python """ import math import os import cv2 import numpy as np def psnr(original: float, contrast: float) -> float: mse = np.mean((original - contrast) ** 2) if mse == 0: return 100 PIXEL_MAX = 255.0 PSNR = 20 * math.log10(PIXEL_MAX / math.sqrt(mse)) return PSNR def main() -> None: dir_path = os.path.dirname(os.path.realpath(__file__)) # Loading images (original image and compressed image) original = cv2.imread(os.path.join(dir_path, "image_data/original_image.png")) contrast = cv2.imread(os.path.join(dir_path, "image_data/compressed_image.png"), 1) original2 = cv2.imread(os.path.join(dir_path, "image_data/PSNR-example-base.png")) contrast2 = cv2.imread( os.path.join(dir_path, "image_data/PSNR-example-comp-10.jpg"), 1 ) # Value expected: 29.73dB print("-- First Test --") print(f"PSNR value is {psnr(original, contrast)} dB") # # Value expected: 31.53dB (Wikipedia Example) print("\n-- Second Test --") print(f"PSNR value is {psnr(original2, contrast2)} dB") if __name__ == "__main__": main()
1
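A small self-contained check of the PSNR formula from the record above on synthetic arrays, so no image files or cv2 are needed. The explicit float cast is an addition made here as a precaution, since subtracting uint8 arrays directly wraps around; the constants match the record.

```python
import math

import numpy as np


def psnr(original: np.ndarray, contrast: np.ndarray) -> float:
    mse = np.mean((original.astype(float) - contrast.astype(float)) ** 2)
    if mse == 0:
        return 100.0
    return 20 * math.log10(255.0 / math.sqrt(mse))


rng = np.random.default_rng(seed=0)
image = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
noise = rng.integers(-5, 6, size=image.shape)
noisy = np.clip(image.astype(int) + noise, 0, 255).astype(np.uint8)
print(f"PSNR of the noisy copy: {psnr(image, noisy):.2f} dB")
```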
TheAlgorithms/Python
5,570
[mypy] annotate `compression`
### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
ErwinJunge
"2021-10-23T19:54:11Z"
"2021-10-26T10:29:28Z"
de07245c170f2007cef415fa4114be870078988e
e49d8e3af427353c18fe5f1afb0927e3e8d6461c
[mypy] annotate `compression`. ### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Given an array of integers and another integer target, we are required to find a triplet from the array such that it's sum is equal to the target. """ from __future__ import annotations from itertools import permutations from random import randint from timeit import repeat def make_dataset() -> tuple[list[int], int]: arr = [randint(-1000, 1000) for i in range(10)] r = randint(-5000, 5000) return (arr, r) dataset = make_dataset() def triplet_sum1(arr: list[int], target: int) -> tuple[int, ...]: """ Returns a triplet in the array with sum equal to target, else (0, 0, 0). >>> triplet_sum1([13, 29, 7, 23, 5], 35) (5, 7, 23) >>> triplet_sum1([37, 9, 19, 50, 44], 65) (9, 19, 37) >>> arr = [6, 47, 27, 1, 15] >>> target = 11 >>> triplet_sum1(arr, target) (0, 0, 0) """ for triplet in permutations(arr, 3): if sum(triplet) == target: return tuple(sorted(triplet)) return (0, 0, 0) def triplet_sum2(arr: list[int], target: int) -> tuple[int, int, int]: """ Returns a triplet in the array with sum equal to target, else (0, 0, 0). >>> triplet_sum2([13, 29, 7, 23, 5], 35) (5, 7, 23) >>> triplet_sum2([37, 9, 19, 50, 44], 65) (9, 19, 37) >>> arr = [6, 47, 27, 1, 15] >>> target = 11 >>> triplet_sum2(arr, target) (0, 0, 0) """ arr.sort() n = len(arr) for i in range(n - 1): left, right = i + 1, n - 1 while left < right: if arr[i] + arr[left] + arr[right] == target: return (arr[i], arr[left], arr[right]) elif arr[i] + arr[left] + arr[right] < target: left += 1 elif arr[i] + arr[left] + arr[right] > target: right -= 1 return (0, 0, 0) def solution_times() -> tuple[float, float]: setup_code = """ from __main__ import dataset, triplet_sum1, triplet_sum2 """ test_code1 = """ triplet_sum1(*dataset) """ test_code2 = """ triplet_sum2(*dataset) """ times1 = repeat(setup=setup_code, stmt=test_code1, repeat=5, number=10000) times2 = repeat(setup=setup_code, stmt=test_code2, repeat=5, number=10000) return (min(times1), min(times2)) if __name__ == "__main__": from doctest import testmod testmod() times = solution_times() print(f"The time for naive implementation is {times[0]}.") print(f"The time for optimized implementation is {times[1]}.")
""" Given an array of integers and another integer target, we are required to find a triplet from the array such that it's sum is equal to the target. """ from __future__ import annotations from itertools import permutations from random import randint from timeit import repeat def make_dataset() -> tuple[list[int], int]: arr = [randint(-1000, 1000) for i in range(10)] r = randint(-5000, 5000) return (arr, r) dataset = make_dataset() def triplet_sum1(arr: list[int], target: int) -> tuple[int, ...]: """ Returns a triplet in the array with sum equal to target, else (0, 0, 0). >>> triplet_sum1([13, 29, 7, 23, 5], 35) (5, 7, 23) >>> triplet_sum1([37, 9, 19, 50, 44], 65) (9, 19, 37) >>> arr = [6, 47, 27, 1, 15] >>> target = 11 >>> triplet_sum1(arr, target) (0, 0, 0) """ for triplet in permutations(arr, 3): if sum(triplet) == target: return tuple(sorted(triplet)) return (0, 0, 0) def triplet_sum2(arr: list[int], target: int) -> tuple[int, int, int]: """ Returns a triplet in the array with sum equal to target, else (0, 0, 0). >>> triplet_sum2([13, 29, 7, 23, 5], 35) (5, 7, 23) >>> triplet_sum2([37, 9, 19, 50, 44], 65) (9, 19, 37) >>> arr = [6, 47, 27, 1, 15] >>> target = 11 >>> triplet_sum2(arr, target) (0, 0, 0) """ arr.sort() n = len(arr) for i in range(n - 1): left, right = i + 1, n - 1 while left < right: if arr[i] + arr[left] + arr[right] == target: return (arr[i], arr[left], arr[right]) elif arr[i] + arr[left] + arr[right] < target: left += 1 elif arr[i] + arr[left] + arr[right] > target: right -= 1 return (0, 0, 0) def solution_times() -> tuple[float, float]: setup_code = """ from __main__ import dataset, triplet_sum1, triplet_sum2 """ test_code1 = """ triplet_sum1(*dataset) """ test_code2 = """ triplet_sum2(*dataset) """ times1 = repeat(setup=setup_code, stmt=test_code1, repeat=5, number=10000) times2 = repeat(setup=setup_code, stmt=test_code2, repeat=5, number=10000) return (min(times1), min(times2)) if __name__ == "__main__": from doctest import testmod testmod() times = solution_times() print(f"The time for naive implementation is {times[0]}.") print(f"The time for optimized implementation is {times[1]}.")
-1
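The two-pointer step inside `triplet_sum2` above can be looked at in isolation. Below is a small sketch that fixes the first element and scans the sorted remainder from both ends; `two_sum_sorted` is a hypothetical helper name, and the values are taken from the record's doctest.

```python
from __future__ import annotations


def two_sum_sorted(arr: list[int], start: int, target: int) -> tuple[int, int] | None:
    # arr must be sorted ascending; scan arr[start:] from both ends.
    left, right = start, len(arr) - 1
    while left < right:
        pair_sum = arr[left] + arr[right]
        if pair_sum == target:
            return (arr[left], arr[right])
        if pair_sum < target:
            left += 1
        else:
            right -= 1
    return None


arr = sorted([13, 29, 7, 23, 5])   # [5, 7, 13, 23, 29]
# Fix arr[0] == 5, then look for the remaining 35 - 5 == 30 in arr[1:].
print(two_sum_sorted(arr, 1, 30))  # (7, 23), so the triplet is (5, 7, 23)
```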
TheAlgorithms/Python
5,570
[mypy] annotate `compression`
### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
ErwinJunge
"2021-10-23T19:54:11Z"
"2021-10-26T10:29:28Z"
de07245c170f2007cef415fa4114be870078988e
e49d8e3af427353c18fe5f1afb0927e3e8d6461c
[mypy] annotate `compression`. ### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
class Graph: def __init__(self, vertex): self.vertex = vertex self.graph = [[0] * vertex for i in range(vertex)] def add_edge(self, u, v): self.graph[u - 1][v - 1] = 1 self.graph[v - 1][u - 1] = 1 def show(self): for i in self.graph: for j in i: print(j, end=" ") print(" ") g = Graph(100) g.add_edge(1, 4) g.add_edge(4, 2) g.add_edge(4, 5) g.add_edge(2, 5) g.add_edge(5, 3) g.show()
class Graph: def __init__(self, vertex): self.vertex = vertex self.graph = [[0] * vertex for i in range(vertex)] def add_edge(self, u, v): self.graph[u - 1][v - 1] = 1 self.graph[v - 1][u - 1] = 1 def show(self): for i in self.graph: for j in i: print(j, end=" ") print(" ") g = Graph(100) g.add_edge(1, 4) g.add_edge(4, 2) g.add_edge(4, 5) g.add_edge(2, 5) g.add_edge(5, 3) g.show()
-1
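The adjacency-matrix record above builds a 100x100 grid, which is unwieldy to read when printed. The same idea at a readable size, with the symmetric writes commented; this is a restatement for illustration, not a change to the record.

```python
class Graph:
    def __init__(self, vertex: int) -> None:
        self.vertex = vertex
        # vertex-by-vertex matrix of 0/1 flags, 0 meaning "no edge".
        self.graph = [[0] * vertex for _ in range(vertex)]

    def add_edge(self, u: int, v: int) -> None:
        # Vertices are 1-based; write both cells so the graph stays undirected.
        self.graph[u - 1][v - 1] = 1
        self.graph[v - 1][u - 1] = 1


g = Graph(5)
for u, v in [(1, 4), (4, 2), (4, 5), (2, 5), (5, 3)]:
    g.add_edge(u, v)
for row in g.graph:
    print(*row)
```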
TheAlgorithms/Python
5,570
[mypy] annotate `compression`
### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
ErwinJunge
"2021-10-23T19:54:11Z"
"2021-10-26T10:29:28Z"
de07245c170f2007cef415fa4114be870078988e
e49d8e3af427353c18fe5f1afb0927e3e8d6461c
[mypy] annotate `compression`. ### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
if __name__ == "__main__": import socket # Import socket module sock = socket.socket() # Create a socket object host = socket.gethostname() # Get local machine name port = 12312 sock.connect((host, port)) sock.send(b"Hello server!") with open("Received_file", "wb") as out_file: print("File opened") print("Receiving data...") while True: data = sock.recv(1024) print(f"{data = }") if not data: break out_file.write(data) # Write data to a file print("Successfully got the file") sock.close() print("Connection closed")
if __name__ == "__main__": import socket # Import socket module sock = socket.socket() # Create a socket object host = socket.gethostname() # Get local machine name port = 12312 sock.connect((host, port)) sock.send(b"Hello server!") with open("Received_file", "wb") as out_file: print("File opened") print("Receiving data...") while True: data = sock.recv(1024) print(f"{data = }") if not data: break out_file.write(data) # Write data to a file print("Successfully got the file") sock.close() print("Connection closed")
-1
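The socket record above is only the receiving client. For context, a minimal sketch of a matching sender is shown below; it is not taken from the repository, the port simply mirrors the client, and `file_to_send.bin` is a placeholder name.

```python
import socket

with socket.socket() as server:
    server.bind((socket.gethostname(), 12312))  # same port the client connects to
    server.listen(1)
    conn, addr = server.accept()
    with conn:
        print(f"Connection from {addr}")
        print(conn.recv(1024))                  # the client's greeting
        with open("file_to_send.bin", "rb") as source:
            while chunk := source.read(1024):
                conn.sendall(chunk)
print("File sent, connection closed")
```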
TheAlgorithms/Python
5,570
[mypy] annotate `compression`
### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
ErwinJunge
"2021-10-23T19:54:11Z"
"2021-10-26T10:29:28Z"
de07245c170f2007cef415fa4114be870078988e
e49d8e3af427353c18fe5f1afb0927e3e8d6461c
[mypy] annotate `compression`. ### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
#!/usr/bin/env python3 """ Python program to translate to and from Morse code. https://en.wikipedia.org/wiki/Morse_code """ # fmt: off MORSE_CODE_DICT = { "A": ".-", "B": "-...", "C": "-.-.", "D": "-..", "E": ".", "F": "..-.", "G": "--.", "H": "....", "I": "..", "J": ".---", "K": "-.-", "L": ".-..", "M": "--", "N": "-.", "O": "---", "P": ".--.", "Q": "--.-", "R": ".-.", "S": "...", "T": "-", "U": "..-", "V": "...-", "W": ".--", "X": "-..-", "Y": "-.--", "Z": "--..", "1": ".----", "2": "..---", "3": "...--", "4": "....-", "5": ".....", "6": "-....", "7": "--...", "8": "---..", "9": "----.", "0": "-----", "&": ".-...", "@": ".--.-.", ":": "---...", ",": "--..--", ".": ".-.-.-", "'": ".----.", '"': ".-..-.", "?": "..--..", "/": "-..-.", "=": "-...-", "+": ".-.-.", "-": "-....-", "(": "-.--.", ")": "-.--.-", "!": "-.-.--", " ": "/" } # Exclamation mark is not in ITU-R recommendation # fmt: on REVERSE_DICT = {value: key for key, value in MORSE_CODE_DICT.items()} def encrypt(message: str) -> str: """ >>> encrypt("Sos!") '... --- ... -.-.--' >>> encrypt("SOS!") == encrypt("sos!") True """ return " ".join(MORSE_CODE_DICT[char] for char in message.upper()) def decrypt(message: str) -> str: """ >>> decrypt('... --- ... -.-.--') 'SOS!' """ return "".join(REVERSE_DICT[char] for char in message.split()) def main() -> None: """ >>> s = "".join(MORSE_CODE_DICT) >>> decrypt(encrypt(s)) == s True """ message = "Morse code here!" print(message) message = encrypt(message) print(message) message = decrypt(message) print(message) if __name__ == "__main__": main()
#!/usr/bin/env python3 """ Python program to translate to and from Morse code. https://en.wikipedia.org/wiki/Morse_code """ # fmt: off MORSE_CODE_DICT = { "A": ".-", "B": "-...", "C": "-.-.", "D": "-..", "E": ".", "F": "..-.", "G": "--.", "H": "....", "I": "..", "J": ".---", "K": "-.-", "L": ".-..", "M": "--", "N": "-.", "O": "---", "P": ".--.", "Q": "--.-", "R": ".-.", "S": "...", "T": "-", "U": "..-", "V": "...-", "W": ".--", "X": "-..-", "Y": "-.--", "Z": "--..", "1": ".----", "2": "..---", "3": "...--", "4": "....-", "5": ".....", "6": "-....", "7": "--...", "8": "---..", "9": "----.", "0": "-----", "&": ".-...", "@": ".--.-.", ":": "---...", ",": "--..--", ".": ".-.-.-", "'": ".----.", '"': ".-..-.", "?": "..--..", "/": "-..-.", "=": "-...-", "+": ".-.-.", "-": "-....-", "(": "-.--.", ")": "-.--.-", "!": "-.-.--", " ": "/" } # Exclamation mark is not in ITU-R recommendation # fmt: on REVERSE_DICT = {value: key for key, value in MORSE_CODE_DICT.items()} def encrypt(message: str) -> str: """ >>> encrypt("Sos!") '... --- ... -.-.--' >>> encrypt("SOS!") == encrypt("sos!") True """ return " ".join(MORSE_CODE_DICT[char] for char in message.upper()) def decrypt(message: str) -> str: """ >>> decrypt('... --- ... -.-.--') 'SOS!' """ return "".join(REVERSE_DICT[char] for char in message.split()) def main() -> None: """ >>> s = "".join(MORSE_CODE_DICT) >>> decrypt(encrypt(s)) == s True """ message = "Morse code here!" print(message) message = encrypt(message) print(message) message = decrypt(message) print(message) if __name__ == "__main__": main()
-1
TheAlgorithms/Python
5,570
[mypy] annotate `compression`
### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
ErwinJunge
"2021-10-23T19:54:11Z"
"2021-10-26T10:29:28Z"
de07245c170f2007cef415fa4114be870078988e
e49d8e3af427353c18fe5f1afb0927e3e8d6461c
[mypy] annotate `compression`. ### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" The Mandelbrot set is the set of complex numbers "c" for which the series "z_(n+1) = z_n * z_n + c" does not diverge, i.e. remains bounded. Thus, a complex number "c" is a member of the Mandelbrot set if, when starting with "z_0 = 0" and applying the iteration repeatedly, the absolute value of "z_n" remains bounded for all "n > 0". Complex numbers can be written as "a + b*i": "a" is the real component, usually drawn on the x-axis, and "b*i" is the imaginary component, usually drawn on the y-axis. Most visualizations of the Mandelbrot set use a color-coding to indicate after how many steps in the series the numbers outside the set diverge. Images of the Mandelbrot set exhibit an elaborate and infinitely complicated boundary that reveals progressively ever-finer recursive detail at increasing magnifications, making the boundary of the Mandelbrot set a fractal curve. (description adapted from https://en.wikipedia.org/wiki/Mandelbrot_set ) (see also https://en.wikipedia.org/wiki/Plotting_algorithms_for_the_Mandelbrot_set ) """ import colorsys from PIL import Image # type: ignore def get_distance(x: float, y: float, max_step: int) -> float: """ Return the relative distance (= step/max_step) after which the complex number constituted by this x-y-pair diverges. Members of the Mandelbrot set do not diverge so their distance is 1. >>> get_distance(0, 0, 50) 1.0 >>> get_distance(0.5, 0.5, 50) 0.061224489795918366 >>> get_distance(2, 0, 50) 0.0 """ a = x b = y for step in range(max_step): a_new = a * a - b * b + x b = 2 * a * b + y a = a_new # divergence happens for all complex number with an absolute value # greater than 4 if a * a + b * b > 4: break return step / (max_step - 1) def get_black_and_white_rgb(distance: float) -> tuple: """ Black&white color-coding that ignores the relative distance. The Mandelbrot set is black, everything else is white. >>> get_black_and_white_rgb(0) (255, 255, 255) >>> get_black_and_white_rgb(0.5) (255, 255, 255) >>> get_black_and_white_rgb(1) (0, 0, 0) """ if distance == 1: return (0, 0, 0) else: return (255, 255, 255) def get_color_coded_rgb(distance: float) -> tuple: """ Color-coding taking the relative distance into account. The Mandelbrot set is black. >>> get_color_coded_rgb(0) (255, 0, 0) >>> get_color_coded_rgb(0.5) (0, 255, 255) >>> get_color_coded_rgb(1) (0, 0, 0) """ if distance == 1: return (0, 0, 0) else: return tuple(round(i * 255) for i in colorsys.hsv_to_rgb(distance, 1, 1)) def get_image( image_width: int = 800, image_height: int = 600, figure_center_x: float = -0.6, figure_center_y: float = 0, figure_width: float = 3.2, max_step: int = 50, use_distance_color_coding: bool = True, ) -> Image.Image: """ Function to generate the image of the Mandelbrot set. Two types of coordinates are used: image-coordinates that refer to the pixels and figure-coordinates that refer to the complex numbers inside and outside the Mandelbrot set. The figure-coordinates in the arguments of this function determine which section of the Mandelbrot set is viewed. The main area of the Mandelbrot set is roughly between "-1.5 < x < 0.5" and "-1 < y < 1" in the figure-coordinates. Commenting out tests that slow down pytest... 
# 13.35s call fractals/mandelbrot.py::mandelbrot.get_image # >>> get_image().load()[0,0] (255, 0, 0) # >>> get_image(use_distance_color_coding = False).load()[0,0] (255, 255, 255) """ img = Image.new("RGB", (image_width, image_height)) pixels = img.load() # loop through the image-coordinates for image_x in range(image_width): for image_y in range(image_height): # determine the figure-coordinates based on the image-coordinates figure_height = figure_width / image_width * image_height figure_x = figure_center_x + (image_x / image_width - 0.5) * figure_width figure_y = figure_center_y + (image_y / image_height - 0.5) * figure_height distance = get_distance(figure_x, figure_y, max_step) # color the corresponding pixel based on the selected coloring-function if use_distance_color_coding: pixels[image_x, image_y] = get_color_coded_rgb(distance) else: pixels[image_x, image_y] = get_black_and_white_rgb(distance) return img if __name__ == "__main__": import doctest doctest.testmod() # colored version, full figure img = get_image() # uncomment for colored version, different section, zoomed in # img = get_image(figure_center_x = -0.6, figure_center_y = -0.4, # figure_width = 0.8) # uncomment for black and white version, full figure # img = get_image(use_distance_color_coding = False) # uncomment to save the image # img.save("mandelbrot.png") img.show()
""" The Mandelbrot set is the set of complex numbers "c" for which the series "z_(n+1) = z_n * z_n + c" does not diverge, i.e. remains bounded. Thus, a complex number "c" is a member of the Mandelbrot set if, when starting with "z_0 = 0" and applying the iteration repeatedly, the absolute value of "z_n" remains bounded for all "n > 0". Complex numbers can be written as "a + b*i": "a" is the real component, usually drawn on the x-axis, and "b*i" is the imaginary component, usually drawn on the y-axis. Most visualizations of the Mandelbrot set use a color-coding to indicate after how many steps in the series the numbers outside the set diverge. Images of the Mandelbrot set exhibit an elaborate and infinitely complicated boundary that reveals progressively ever-finer recursive detail at increasing magnifications, making the boundary of the Mandelbrot set a fractal curve. (description adapted from https://en.wikipedia.org/wiki/Mandelbrot_set ) (see also https://en.wikipedia.org/wiki/Plotting_algorithms_for_the_Mandelbrot_set ) """ import colorsys from PIL import Image # type: ignore def get_distance(x: float, y: float, max_step: int) -> float: """ Return the relative distance (= step/max_step) after which the complex number constituted by this x-y-pair diverges. Members of the Mandelbrot set do not diverge so their distance is 1. >>> get_distance(0, 0, 50) 1.0 >>> get_distance(0.5, 0.5, 50) 0.061224489795918366 >>> get_distance(2, 0, 50) 0.0 """ a = x b = y for step in range(max_step): a_new = a * a - b * b + x b = 2 * a * b + y a = a_new # divergence happens for all complex number with an absolute value # greater than 4 if a * a + b * b > 4: break return step / (max_step - 1) def get_black_and_white_rgb(distance: float) -> tuple: """ Black&white color-coding that ignores the relative distance. The Mandelbrot set is black, everything else is white. >>> get_black_and_white_rgb(0) (255, 255, 255) >>> get_black_and_white_rgb(0.5) (255, 255, 255) >>> get_black_and_white_rgb(1) (0, 0, 0) """ if distance == 1: return (0, 0, 0) else: return (255, 255, 255) def get_color_coded_rgb(distance: float) -> tuple: """ Color-coding taking the relative distance into account. The Mandelbrot set is black. >>> get_color_coded_rgb(0) (255, 0, 0) >>> get_color_coded_rgb(0.5) (0, 255, 255) >>> get_color_coded_rgb(1) (0, 0, 0) """ if distance == 1: return (0, 0, 0) else: return tuple(round(i * 255) for i in colorsys.hsv_to_rgb(distance, 1, 1)) def get_image( image_width: int = 800, image_height: int = 600, figure_center_x: float = -0.6, figure_center_y: float = 0, figure_width: float = 3.2, max_step: int = 50, use_distance_color_coding: bool = True, ) -> Image.Image: """ Function to generate the image of the Mandelbrot set. Two types of coordinates are used: image-coordinates that refer to the pixels and figure-coordinates that refer to the complex numbers inside and outside the Mandelbrot set. The figure-coordinates in the arguments of this function determine which section of the Mandelbrot set is viewed. The main area of the Mandelbrot set is roughly between "-1.5 < x < 0.5" and "-1 < y < 1" in the figure-coordinates. Commenting out tests that slow down pytest... 
# 13.35s call fractals/mandelbrot.py::mandelbrot.get_image # >>> get_image().load()[0,0] (255, 0, 0) # >>> get_image(use_distance_color_coding = False).load()[0,0] (255, 255, 255) """ img = Image.new("RGB", (image_width, image_height)) pixels = img.load() # loop through the image-coordinates for image_x in range(image_width): for image_y in range(image_height): # determine the figure-coordinates based on the image-coordinates figure_height = figure_width / image_width * image_height figure_x = figure_center_x + (image_x / image_width - 0.5) * figure_width figure_y = figure_center_y + (image_y / image_height - 0.5) * figure_height distance = get_distance(figure_x, figure_y, max_step) # color the corresponding pixel based on the selected coloring-function if use_distance_color_coding: pixels[image_x, image_y] = get_color_coded_rgb(distance) else: pixels[image_x, image_y] = get_black_and_white_rgb(distance) return img if __name__ == "__main__": import doctest doctest.testmod() # colored version, full figure img = get_image() # uncomment for colored version, different section, zoomed in # img = get_image(figure_center_x = -0.6, figure_center_y = -0.4, # figure_width = 0.8) # uncomment for black and white version, full figure # img = get_image(use_distance_color_coding = False) # uncomment to save the image # img.save("mandelbrot.png") img.show()
-1
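The escape-time rule described in the mandelbrot.py docstring above (z_(n+1) = z_n * z_n + c stays bounded for members of the set) can be checked in isolation. The sketch below is an illustration only, not part of either file snapshot: it uses Python's built-in complex type instead of the separate a/b real and imaginary parts in get_distance, and it seeds the loop with z = c exactly as that function does (a = x, b = y).

```python
def escape_step(c: complex, max_step: int = 50) -> float:
    """Relative step at which z_(n+1) = z_n * z_n + c escapes |z| > 2."""
    z = c  # mirrors get_distance, which starts from a = x, b = y
    for step in range(max_step):
        z = z * z + c
        if abs(z) > 2:  # same divergence test as a * a + b * b > 4
            break
    return step / (max_step - 1)

print(escape_step(0j))      # 1.0 -> stays bounded, inside the set
print(escape_step(2 + 0j))  # 0.0 -> diverges immediately
```

Both printed values agree with the get_distance doctests shown in the file above.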
TheAlgorithms/Python
5,570
[mypy] annotate `compression`
### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
ErwinJunge
"2021-10-23T19:54:11Z"
"2021-10-26T10:29:28Z"
de07245c170f2007cef415fa4114be870078988e
e49d8e3af427353c18fe5f1afb0927e3e8d6461c
[mypy] annotate `compression`. ### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
-1
TheAlgorithms/Python
5,570
[mypy] annotate `compression`
### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
ErwinJunge
"2021-10-23T19:54:11Z"
"2021-10-26T10:29:28Z"
de07245c170f2007cef415fa4114be870078988e
e49d8e3af427353c18fe5f1afb0927e3e8d6461c
[mypy] annotate `compression`. ### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
#
#
-1
TheAlgorithms/Python
5,570
[mypy] annotate `compression`
### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
ErwinJunge
"2021-10-23T19:54:11Z"
"2021-10-26T10:29:28Z"
de07245c170f2007cef415fa4114be870078988e
e49d8e3af427353c18fe5f1afb0927e3e8d6461c
[mypy] annotate `compression`. ### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
-1
TheAlgorithms/Python
5,570
[mypy] annotate `compression`
### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
ErwinJunge
"2021-10-23T19:54:11Z"
"2021-10-26T10:29:28Z"
de07245c170f2007cef415fa4114be870078988e
e49d8e3af427353c18fe5f1afb0927e3e8d6461c
[mypy] annotate `compression`. ### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Simulate the evolution of a highway with only one road that is a loop. The highway is divided in cells, each cell can have at most one car in it. The highway is a loop so when a car comes to one end, it will come out on the other. Each car is represented by its speed (from 0 to 5). Some information about speed: -1 means that the cell on the highway is empty 0 to 5 are the speed of the cars with 0 being the lowest and 5 the highest highway: list[int] Where every position and speed of every car will be stored probability The probability that a driver will slow down initial_speed The speed of the cars a the start frequency How many cells there are between two cars at the start max_speed The maximum speed a car can go to number_of_cells How many cell are there in the highway number_of_update How many times will the position be updated More information here: https://en.wikipedia.org/wiki/Nagel%E2%80%93Schreckenberg_model Examples for doctest: >>> simulate(construct_highway(6, 3, 0), 2, 0, 2) [[0, -1, -1, 0, -1, -1], [-1, 1, -1, -1, 1, -1], [-1, -1, 1, -1, -1, 1]] >>> simulate(construct_highway(5, 2, -2), 3, 0, 2) [[0, -1, 0, -1, 0], [0, -1, 0, -1, -1], [0, -1, -1, 1, -1], [-1, 1, -1, 0, -1]] """ from random import randint, random def construct_highway( number_of_cells: int, frequency: int, initial_speed: int, random_frequency: bool = False, random_speed: bool = False, max_speed: int = 5, ) -> list: """ Build the highway following the parameters given >>> construct_highway(10, 2, 6) [[6, -1, 6, -1, 6, -1, 6, -1, 6, -1]] >>> construct_highway(10, 10, 2) [[2, -1, -1, -1, -1, -1, -1, -1, -1, -1]] """ highway = [[-1] * number_of_cells] # Create a highway without any car i = 0 if initial_speed < 0: initial_speed = 0 while i < number_of_cells: highway[0][i] = ( randint(0, max_speed) if random_speed else initial_speed ) # Place the cars i += ( randint(1, max_speed * 2) if random_frequency else frequency ) # Arbitrary number, may need tuning return highway def get_distance(highway_now: list, car_index: int) -> int: """ Get the distance between a car (at index car_index) and the next car >>> get_distance([6, -1, 6, -1, 6], 2) 1 >>> get_distance([2, -1, -1, -1, 3, 1, 0, 1, 3, 2], 0) 3 >>> get_distance([-1, -1, -1, -1, 2, -1, -1, -1, 3], -1) 4 """ distance = 0 cells = highway_now[car_index + 1 :] for cell in range(len(cells)): # May need a better name for this if cells[cell] != -1: # If the cell is not empty then return distance # we have the distance we wanted distance += 1 # Here if the car is near the end of the highway return distance + get_distance(highway_now, -1) def update(highway_now: list, probability: float, max_speed: int) -> list: """ Update the speed of the cars >>> update([-1, -1, -1, -1, -1, 2, -1, -1, -1, -1, 3], 0.0, 5) [-1, -1, -1, -1, -1, 3, -1, -1, -1, -1, 4] >>> update([-1, -1, 2, -1, -1, -1, -1, 3], 0.0, 5) [-1, -1, 3, -1, -1, -1, -1, 1] """ number_of_cells = len(highway_now) # Beforce calculations, the highway is empty next_highway = [-1] * number_of_cells for car_index in range(number_of_cells): if highway_now[car_index] != -1: # Add 1 to the current speed of the car and cap the speed next_highway[car_index] = min(highway_now[car_index] + 1, max_speed) # Number of empty cell before the next car dn = get_distance(highway_now, car_index) - 1 # We can't have the car causing an accident next_highway[car_index] = min(next_highway[car_index], dn) if random() < probability: # Randomly, a driver will slow down next_highway[car_index] = max(next_highway[car_index] - 1, 0) return 
next_highway def simulate( highway: list, number_of_update: int, probability: float, max_speed: int ) -> list: """ The main function, it will simulate the evolution of the highway >>> simulate([[-1, 2, -1, -1, -1, 3]], 2, 0.0, 3) [[-1, 2, -1, -1, -1, 3], [-1, -1, -1, 2, -1, 0], [1, -1, -1, 0, -1, -1]] >>> simulate([[-1, 2, -1, 3]], 4, 0.0, 3) [[-1, 2, -1, 3], [-1, 0, -1, 0], [-1, 0, -1, 0], [-1, 0, -1, 0], [-1, 0, -1, 0]] """ number_of_cells = len(highway[0]) for i in range(number_of_update): next_speeds_calculated = update(highway[i], probability, max_speed) real_next_speeds = [-1] * number_of_cells for car_index in range(number_of_cells): speed = next_speeds_calculated[car_index] if speed != -1: # Change the position based on the speed (with % to create the loop) index = (car_index + speed) % number_of_cells # Commit the change of position real_next_speeds[index] = speed highway.append(real_next_speeds) return highway if __name__ == "__main__": import doctest doctest.testmod()
""" Simulate the evolution of a highway with only one road that is a loop. The highway is divided in cells, each cell can have at most one car in it. The highway is a loop so when a car comes to one end, it will come out on the other. Each car is represented by its speed (from 0 to 5). Some information about speed: -1 means that the cell on the highway is empty 0 to 5 are the speed of the cars with 0 being the lowest and 5 the highest highway: list[int] Where every position and speed of every car will be stored probability The probability that a driver will slow down initial_speed The speed of the cars a the start frequency How many cells there are between two cars at the start max_speed The maximum speed a car can go to number_of_cells How many cell are there in the highway number_of_update How many times will the position be updated More information here: https://en.wikipedia.org/wiki/Nagel%E2%80%93Schreckenberg_model Examples for doctest: >>> simulate(construct_highway(6, 3, 0), 2, 0, 2) [[0, -1, -1, 0, -1, -1], [-1, 1, -1, -1, 1, -1], [-1, -1, 1, -1, -1, 1]] >>> simulate(construct_highway(5, 2, -2), 3, 0, 2) [[0, -1, 0, -1, 0], [0, -1, 0, -1, -1], [0, -1, -1, 1, -1], [-1, 1, -1, 0, -1]] """ from random import randint, random def construct_highway( number_of_cells: int, frequency: int, initial_speed: int, random_frequency: bool = False, random_speed: bool = False, max_speed: int = 5, ) -> list: """ Build the highway following the parameters given >>> construct_highway(10, 2, 6) [[6, -1, 6, -1, 6, -1, 6, -1, 6, -1]] >>> construct_highway(10, 10, 2) [[2, -1, -1, -1, -1, -1, -1, -1, -1, -1]] """ highway = [[-1] * number_of_cells] # Create a highway without any car i = 0 if initial_speed < 0: initial_speed = 0 while i < number_of_cells: highway[0][i] = ( randint(0, max_speed) if random_speed else initial_speed ) # Place the cars i += ( randint(1, max_speed * 2) if random_frequency else frequency ) # Arbitrary number, may need tuning return highway def get_distance(highway_now: list, car_index: int) -> int: """ Get the distance between a car (at index car_index) and the next car >>> get_distance([6, -1, 6, -1, 6], 2) 1 >>> get_distance([2, -1, -1, -1, 3, 1, 0, 1, 3, 2], 0) 3 >>> get_distance([-1, -1, -1, -1, 2, -1, -1, -1, 3], -1) 4 """ distance = 0 cells = highway_now[car_index + 1 :] for cell in range(len(cells)): # May need a better name for this if cells[cell] != -1: # If the cell is not empty then return distance # we have the distance we wanted distance += 1 # Here if the car is near the end of the highway return distance + get_distance(highway_now, -1) def update(highway_now: list, probability: float, max_speed: int) -> list: """ Update the speed of the cars >>> update([-1, -1, -1, -1, -1, 2, -1, -1, -1, -1, 3], 0.0, 5) [-1, -1, -1, -1, -1, 3, -1, -1, -1, -1, 4] >>> update([-1, -1, 2, -1, -1, -1, -1, 3], 0.0, 5) [-1, -1, 3, -1, -1, -1, -1, 1] """ number_of_cells = len(highway_now) # Beforce calculations, the highway is empty next_highway = [-1] * number_of_cells for car_index in range(number_of_cells): if highway_now[car_index] != -1: # Add 1 to the current speed of the car and cap the speed next_highway[car_index] = min(highway_now[car_index] + 1, max_speed) # Number of empty cell before the next car dn = get_distance(highway_now, car_index) - 1 # We can't have the car causing an accident next_highway[car_index] = min(next_highway[car_index], dn) if random() < probability: # Randomly, a driver will slow down next_highway[car_index] = max(next_highway[car_index] - 1, 0) return 
next_highway def simulate( highway: list, number_of_update: int, probability: float, max_speed: int ) -> list: """ The main function, it will simulate the evolution of the highway >>> simulate([[-1, 2, -1, -1, -1, 3]], 2, 0.0, 3) [[-1, 2, -1, -1, -1, 3], [-1, -1, -1, 2, -1, 0], [1, -1, -1, 0, -1, -1]] >>> simulate([[-1, 2, -1, 3]], 4, 0.0, 3) [[-1, 2, -1, 3], [-1, 0, -1, 0], [-1, 0, -1, 0], [-1, 0, -1, 0], [-1, 0, -1, 0]] """ number_of_cells = len(highway[0]) for i in range(number_of_update): next_speeds_calculated = update(highway[i], probability, max_speed) real_next_speeds = [-1] * number_of_cells for car_index in range(number_of_cells): speed = next_speeds_calculated[car_index] if speed != -1: # Change the position based on the speed (with % to create the loop) index = (car_index + speed) % number_of_cells # Commit the change of position real_next_speeds[index] = speed highway.append(real_next_speeds) return highway if __name__ == "__main__": import doctest doctest.testmod()
-1
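The three speed rules that update() applies above (accelerate by one up to max_speed, cap the speed at the gap to the car ahead, then slow down at random with the given probability) can be restated compactly. The snippet below is a hypothetical re-statement for illustration only; nasch_speeds and its circular gap search are not part of the file, and the wrap-around distance is computed slightly differently from get_distance.

```python
from random import random

def nasch_speeds(road: list, p: float, v_max: int) -> list:
    """Apply the Nagel-Schreckenberg speed rules to one row of the road."""
    n = len(road)
    new = [-1] * n
    for i, v in enumerate(road):
        if v == -1:
            continue  # empty cell, no car to update
        # number of empty cells before the next car (circular search)
        gap = next(d for d in range(1, n + 1) if road[(i + d) % n] != -1) - 1
        v = min(v + 1, v_max)    # rule 1: accelerate
        v = min(v, gap)          # rule 2: do not run into the car ahead
        if random() < p:         # rule 3: random slowdown
            v = max(v - 1, 0)
        new[i] = v
    return new

print(nasch_speeds([-1, -1, -1, -1, -1, 2, -1, -1, -1, -1, 3], 0.0, 5))
# [-1, -1, -1, -1, -1, 3, -1, -1, -1, -1, 4] -- matches the first update() doctest
```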
TheAlgorithms/Python
5,570
[mypy] annotate `compression`
### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
ErwinJunge
"2021-10-23T19:54:11Z"
"2021-10-26T10:29:28Z"
de07245c170f2007cef415fa4114be870078988e
e49d8e3af427353c18fe5f1afb0927e3e8d6461c
[mypy] annotate `compression`. ### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Project Euler Problem 2: https://projecteuler.net/problem=2 Even Fibonacci Numbers Each new term in the Fibonacci sequence is generated by adding the previous two terms. By starting with 1 and 2, the first 10 terms will be: 1, 2, 3, 5, 8, 13, 21, 34, 55, 89, ... By considering the terms in the Fibonacci sequence whose values do not exceed four million, find the sum of the even-valued terms. References: - https://en.wikipedia.org/wiki/Fibonacci_number """ import math from decimal import Decimal, getcontext def solution(n: int = 4000000) -> int: """ Returns the sum of all even fibonacci sequence elements that are lower or equal to n. >>> solution(10) 10 >>> solution(15) 10 >>> solution(2) 2 >>> solution(1) 0 >>> solution(34) 44 >>> solution(3.4) 2 >>> solution(0) Traceback (most recent call last): ... ValueError: Parameter n must be greater than or equal to one. >>> solution(-17) Traceback (most recent call last): ... ValueError: Parameter n must be greater than or equal to one. >>> solution([]) Traceback (most recent call last): ... TypeError: Parameter n must be int or castable to int. >>> solution("asd") Traceback (most recent call last): ... TypeError: Parameter n must be int or castable to int. """ try: n = int(n) except (TypeError, ValueError): raise TypeError("Parameter n must be int or castable to int.") if n <= 0: raise ValueError("Parameter n must be greater than or equal to one.") getcontext().prec = 100 phi = (Decimal(5) ** Decimal(0.5) + 1) / Decimal(2) index = (math.floor(math.log(n * (phi + 2), phi) - 1) // 3) * 3 + 2 num = Decimal(round(phi ** Decimal(index + 1))) / (phi + 2) total = num // 2 return int(total) if __name__ == "__main__": print(f"{solution() = }")
""" Project Euler Problem 2: https://projecteuler.net/problem=2 Even Fibonacci Numbers Each new term in the Fibonacci sequence is generated by adding the previous two terms. By starting with 1 and 2, the first 10 terms will be: 1, 2, 3, 5, 8, 13, 21, 34, 55, 89, ... By considering the terms in the Fibonacci sequence whose values do not exceed four million, find the sum of the even-valued terms. References: - https://en.wikipedia.org/wiki/Fibonacci_number """ import math from decimal import Decimal, getcontext def solution(n: int = 4000000) -> int: """ Returns the sum of all even fibonacci sequence elements that are lower or equal to n. >>> solution(10) 10 >>> solution(15) 10 >>> solution(2) 2 >>> solution(1) 0 >>> solution(34) 44 >>> solution(3.4) 2 >>> solution(0) Traceback (most recent call last): ... ValueError: Parameter n must be greater than or equal to one. >>> solution(-17) Traceback (most recent call last): ... ValueError: Parameter n must be greater than or equal to one. >>> solution([]) Traceback (most recent call last): ... TypeError: Parameter n must be int or castable to int. >>> solution("asd") Traceback (most recent call last): ... TypeError: Parameter n must be int or castable to int. """ try: n = int(n) except (TypeError, ValueError): raise TypeError("Parameter n must be int or castable to int.") if n <= 0: raise ValueError("Parameter n must be greater than or equal to one.") getcontext().prec = 100 phi = (Decimal(5) ** Decimal(0.5) + 1) / Decimal(2) index = (math.floor(math.log(n * (phi + 2), phi) - 1) // 3) * 3 + 2 num = Decimal(round(phi ** Decimal(index + 1))) / (phi + 2) total = num // 2 return int(total) if __name__ == "__main__": print(f"{solution() = }")
-1
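The Project Euler problem 2 file above answers the question with a closed form built on the golden ratio; a direct brute-force sum is a useful cross-check. The helper below is hypothetical (not part of the file) and simply walks the Fibonacci sequence, adding the even terms.

```python
def even_fib_sum(limit: int) -> int:
    """Sum of the even Fibonacci terms that do not exceed limit."""
    total, a, b = 0, 1, 2
    while a <= limit:
        if a % 2 == 0:
            total += a
        a, b = b, a + b
    return total

print(even_fib_sum(10), even_fib_sum(34))  # 10 44 -- same as the solution() doctests
print(even_fib_sum(4000000))               # 4613732
```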
TheAlgorithms/Python
5,570
[mypy] annotate `compression`
### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
ErwinJunge
"2021-10-23T19:54:11Z"
"2021-10-26T10:29:28Z"
de07245c170f2007cef415fa4114be870078988e
e49d8e3af427353c18fe5f1afb0927e3e8d6461c
[mypy] annotate `compression`. ### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
B64_CHARSET = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/" def base64_encode(data: bytes) -> bytes: """Encodes data according to RFC4648. The data is first transformed to binary and appended with binary digits so that its length becomes a multiple of 6, then each 6 binary digits will match a character in the B64_CHARSET string. The number of appended binary digits would later determine how many "=" signs should be added, the padding. For every 2 binary digits added, a "=" sign is added in the output. We can add any binary digits to make it a multiple of 6, for instance, consider the following example: "AA" -> 0010100100101001 -> 001010 010010 1001 As can be seen above, 2 more binary digits should be added, so there's 4 possibilities here: 00, 01, 10 or 11. That being said, Base64 encoding can be used in Steganography to hide data in these appended digits. >>> from base64 import b64encode >>> a = b"This pull request is part of Hacktoberfest20!" >>> b = b"https://tools.ietf.org/html/rfc4648" >>> c = b"A" >>> base64_encode(a) == b64encode(a) True >>> base64_encode(b) == b64encode(b) True >>> base64_encode(c) == b64encode(c) True >>> base64_encode("abc") Traceback (most recent call last): ... TypeError: a bytes-like object is required, not 'str' """ # Make sure the supplied data is a bytes-like object if not isinstance(data, bytes): raise TypeError( f"a bytes-like object is required, not '{data.__class__.__name__}'" ) binary_stream = "".join(bin(byte)[2:].zfill(8) for byte in data) padding_needed = len(binary_stream) % 6 != 0 if padding_needed: # The padding that will be added later padding = b"=" * ((6 - len(binary_stream) % 6) // 2) # Append binary_stream with arbitrary binary digits (0's by default) to make its # length a multiple of 6. binary_stream += "0" * (6 - len(binary_stream) % 6) else: padding = b"" # Encode every 6 binary digits to their corresponding Base64 character return ( "".join( B64_CHARSET[int(binary_stream[index : index + 6], 2)] for index in range(0, len(binary_stream), 6) ).encode() + padding ) def base64_decode(encoded_data: str) -> bytes: """Decodes data according to RFC4648. This does the reverse operation of base64_encode. We first transform the encoded data back to a binary stream, take off the previously appended binary digits according to the padding, at this point we would have a binary stream whose length is multiple of 8, the last step is to convert every 8 bits to a byte. >>> from base64 import b64decode >>> a = "VGhpcyBwdWxsIHJlcXVlc3QgaXMgcGFydCBvZiBIYWNrdG9iZXJmZXN0MjAh" >>> b = "aHR0cHM6Ly90b29scy5pZXRmLm9yZy9odG1sL3JmYzQ2NDg=" >>> c = "QQ==" >>> base64_decode(a) == b64decode(a) True >>> base64_decode(b) == b64decode(b) True >>> base64_decode(c) == b64decode(c) True >>> base64_decode("abc") Traceback (most recent call last): ... 
AssertionError: Incorrect padding """ # Make sure encoded_data is either a string or a bytes-like object if not isinstance(encoded_data, bytes) and not isinstance(encoded_data, str): raise TypeError( "argument should be a bytes-like object or ASCII string, not " f"'{encoded_data.__class__.__name__}'" ) # In case encoded_data is a bytes-like object, make sure it contains only # ASCII characters so we convert it to a string object if isinstance(encoded_data, bytes): try: encoded_data = encoded_data.decode("utf-8") except UnicodeDecodeError: raise ValueError("base64 encoded data should only contain ASCII characters") padding = encoded_data.count("=") # Check if the encoded string contains non base64 characters if padding: assert all( char in B64_CHARSET for char in encoded_data[:-padding] ), "Invalid base64 character(s) found." else: assert all( char in B64_CHARSET for char in encoded_data ), "Invalid base64 character(s) found." # Check the padding assert len(encoded_data) % 4 == 0 and padding < 3, "Incorrect padding" if padding: # Remove padding if there is one encoded_data = encoded_data[:-padding] binary_stream = "".join( bin(B64_CHARSET.index(char))[2:].zfill(6) for char in encoded_data )[: -padding * 2] else: binary_stream = "".join( bin(B64_CHARSET.index(char))[2:].zfill(6) for char in encoded_data ) data = [ int(binary_stream[index : index + 8], 2) for index in range(0, len(binary_stream), 8) ] return bytes(data) if __name__ == "__main__": import doctest doctest.testmod()
B64_CHARSET = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/" def base64_encode(data: bytes) -> bytes: """Encodes data according to RFC4648. The data is first transformed to binary and appended with binary digits so that its length becomes a multiple of 6, then each 6 binary digits will match a character in the B64_CHARSET string. The number of appended binary digits would later determine how many "=" signs should be added, the padding. For every 2 binary digits added, a "=" sign is added in the output. We can add any binary digits to make it a multiple of 6, for instance, consider the following example: "AA" -> 0010100100101001 -> 001010 010010 1001 As can be seen above, 2 more binary digits should be added, so there's 4 possibilities here: 00, 01, 10 or 11. That being said, Base64 encoding can be used in Steganography to hide data in these appended digits. >>> from base64 import b64encode >>> a = b"This pull request is part of Hacktoberfest20!" >>> b = b"https://tools.ietf.org/html/rfc4648" >>> c = b"A" >>> base64_encode(a) == b64encode(a) True >>> base64_encode(b) == b64encode(b) True >>> base64_encode(c) == b64encode(c) True >>> base64_encode("abc") Traceback (most recent call last): ... TypeError: a bytes-like object is required, not 'str' """ # Make sure the supplied data is a bytes-like object if not isinstance(data, bytes): raise TypeError( f"a bytes-like object is required, not '{data.__class__.__name__}'" ) binary_stream = "".join(bin(byte)[2:].zfill(8) for byte in data) padding_needed = len(binary_stream) % 6 != 0 if padding_needed: # The padding that will be added later padding = b"=" * ((6 - len(binary_stream) % 6) // 2) # Append binary_stream with arbitrary binary digits (0's by default) to make its # length a multiple of 6. binary_stream += "0" * (6 - len(binary_stream) % 6) else: padding = b"" # Encode every 6 binary digits to their corresponding Base64 character return ( "".join( B64_CHARSET[int(binary_stream[index : index + 6], 2)] for index in range(0, len(binary_stream), 6) ).encode() + padding ) def base64_decode(encoded_data: str) -> bytes: """Decodes data according to RFC4648. This does the reverse operation of base64_encode. We first transform the encoded data back to a binary stream, take off the previously appended binary digits according to the padding, at this point we would have a binary stream whose length is multiple of 8, the last step is to convert every 8 bits to a byte. >>> from base64 import b64decode >>> a = "VGhpcyBwdWxsIHJlcXVlc3QgaXMgcGFydCBvZiBIYWNrdG9iZXJmZXN0MjAh" >>> b = "aHR0cHM6Ly90b29scy5pZXRmLm9yZy9odG1sL3JmYzQ2NDg=" >>> c = "QQ==" >>> base64_decode(a) == b64decode(a) True >>> base64_decode(b) == b64decode(b) True >>> base64_decode(c) == b64decode(c) True >>> base64_decode("abc") Traceback (most recent call last): ... 
AssertionError: Incorrect padding """ # Make sure encoded_data is either a string or a bytes-like object if not isinstance(encoded_data, bytes) and not isinstance(encoded_data, str): raise TypeError( "argument should be a bytes-like object or ASCII string, not " f"'{encoded_data.__class__.__name__}'" ) # In case encoded_data is a bytes-like object, make sure it contains only # ASCII characters so we convert it to a string object if isinstance(encoded_data, bytes): try: encoded_data = encoded_data.decode("utf-8") except UnicodeDecodeError: raise ValueError("base64 encoded data should only contain ASCII characters") padding = encoded_data.count("=") # Check if the encoded string contains non base64 characters if padding: assert all( char in B64_CHARSET for char in encoded_data[:-padding] ), "Invalid base64 character(s) found." else: assert all( char in B64_CHARSET for char in encoded_data ), "Invalid base64 character(s) found." # Check the padding assert len(encoded_data) % 4 == 0 and padding < 3, "Incorrect padding" if padding: # Remove padding if there is one encoded_data = encoded_data[:-padding] binary_stream = "".join( bin(B64_CHARSET.index(char))[2:].zfill(6) for char in encoded_data )[: -padding * 2] else: binary_stream = "".join( bin(B64_CHARSET.index(char))[2:].zfill(6) for char in encoded_data ) data = [ int(binary_stream[index : index + 8], 2) for index in range(0, len(binary_stream), 8) ] return bytes(data) if __name__ == "__main__": import doctest doctest.testmod()
-1
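The padding rule described in the base64_encode docstring can be traced by hand for the b"A" doctest above: one byte is 8 bits, four zero bits are appended to reach two 6-bit groups, and since one "=" is added per 2 appended bits the result carries "==". A standalone sketch of that trace (not part of the module; the charset is repeated here only to keep the sketch self-contained):

```python
B64_CHARSET = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/"

bits = format(ord("A"), "08b")              # '01000001'
pad_bits = (6 - len(bits) % 6) % 6          # 4 zero bits must be appended
bits += "0" * pad_bits                      # '010000' '010000'
encoded = "".join(
    B64_CHARSET[int(bits[i : i + 6], 2)] for i in range(0, len(bits), 6)
)
print(encoded + "=" * (pad_bits // 2))      # QQ== -- same as b64encode(b"A")
```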
TheAlgorithms/Python
5,570
[mypy] annotate `compression`
### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
ErwinJunge
"2021-10-23T19:54:11Z"
"2021-10-26T10:29:28Z"
de07245c170f2007cef415fa4114be870078988e
e49d8e3af427353c18fe5f1afb0927e3e8d6461c
[mypy] annotate `compression`. ### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
-1
TheAlgorithms/Python
5,570
[mypy] annotate `compression`
### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
ErwinJunge
"2021-10-23T19:54:11Z"
"2021-10-26T10:29:28Z"
de07245c170f2007cef415fa4114be870078988e
e49d8e3af427353c18fe5f1afb0927e3e8d6461c
[mypy] annotate `compression`. ### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Project Euler Problem 6: https://projecteuler.net/problem=6 Sum square difference The sum of the squares of the first ten natural numbers is, 1^2 + 2^2 + ... + 10^2 = 385 The square of the sum of the first ten natural numbers is, (1 + 2 + ... + 10)^2 = 55^2 = 3025 Hence the difference between the sum of the squares of the first ten natural numbers and the square of the sum is 3025 - 385 = 2640. Find the difference between the sum of the squares of the first one hundred natural numbers and the square of the sum. """ def solution(n: int = 100) -> int: """ Returns the difference between the sum of the squares of the first n natural numbers and the square of the sum. >>> solution(10) 2640 >>> solution(15) 13160 >>> solution(20) 41230 >>> solution(50) 1582700 """ sum_cubes = (n * (n + 1) // 2) ** 2 sum_squares = n * (n + 1) * (2 * n + 1) // 6 return sum_cubes - sum_squares if __name__ == "__main__": print(f"{solution() = }")
""" Project Euler Problem 6: https://projecteuler.net/problem=6 Sum square difference The sum of the squares of the first ten natural numbers is, 1^2 + 2^2 + ... + 10^2 = 385 The square of the sum of the first ten natural numbers is, (1 + 2 + ... + 10)^2 = 55^2 = 3025 Hence the difference between the sum of the squares of the first ten natural numbers and the square of the sum is 3025 - 385 = 2640. Find the difference between the sum of the squares of the first one hundred natural numbers and the square of the sum. """ def solution(n: int = 100) -> int: """ Returns the difference between the sum of the squares of the first n natural numbers and the square of the sum. >>> solution(10) 2640 >>> solution(15) 13160 >>> solution(20) 41230 >>> solution(50) 1582700 """ sum_cubes = (n * (n + 1) // 2) ** 2 sum_squares = n * (n + 1) * (2 * n + 1) // 6 return sum_cubes - sum_squares if __name__ == "__main__": print(f"{solution() = }")
-1
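The closed forms used in solution() above (square of the sum and sum of the squares) are easy to confirm by brute force for the n = 10 case quoted in the problem statement. A throwaway check, not part of the file:

```python
n = 10
square_of_sum = sum(range(1, n + 1)) ** 2              # 55**2 = 3025
sum_of_squares = sum(i * i for i in range(1, n + 1))   # 385
print(square_of_sum - sum_of_squares)                  # 2640, matching solution(10)
```

Note that the file stores the square of the sum in a variable named sum_cubes; by Nicomachus's theorem (1^3 + ... + n^3 = (n(n+1)/2)^2) the two quantities are equal, which is why the doctests still pass.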
TheAlgorithms/Python
5,570
[mypy] annotate `compression`
### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
ErwinJunge
"2021-10-23T19:54:11Z"
"2021-10-26T10:29:28Z"
de07245c170f2007cef415fa4114be870078988e
e49d8e3af427353c18fe5f1afb0927e3e8d6461c
[mypy] annotate `compression`. ### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
"""Uses Pythagoras theorem to calculate the distance between two points in space.""" import math class Point: def __init__(self, x, y, z): self.x = x self.y = y self.z = z def __repr__(self) -> str: return f"Point({self.x}, {self.y}, {self.z})" def distance(a: Point, b: Point) -> float: return math.sqrt(abs((b.x - a.x) ** 2 + (b.y - a.y) ** 2 + (b.z - a.z) ** 2)) def test_distance() -> None: """ >>> point1 = Point(2, -1, 7) >>> point2 = Point(1, -3, 5) >>> print(f"Distance from {point1} to {point2} is {distance(point1, point2)}") Distance from Point(2, -1, 7) to Point(1, -3, 5) is 3.0 """ pass if __name__ == "__main__": import doctest doctest.testmod()
"""Uses Pythagoras theorem to calculate the distance between two points in space.""" import math class Point: def __init__(self, x, y, z): self.x = x self.y = y self.z = z def __repr__(self) -> str: return f"Point({self.x}, {self.y}, {self.z})" def distance(a: Point, b: Point) -> float: return math.sqrt(abs((b.x - a.x) ** 2 + (b.y - a.y) ** 2 + (b.z - a.z) ** 2)) def test_distance() -> None: """ >>> point1 = Point(2, -1, 7) >>> point2 = Point(1, -3, 5) >>> print(f"Distance from {point1} to {point2} is {distance(point1, point2)}") Distance from Point(2, -1, 7) to Point(1, -3, 5) is 3.0 """ pass if __name__ == "__main__": import doctest doctest.testmod()
-1
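The distance() function above is the ordinary Euclidean distance, so it can be cross-checked against the standard library. The one-liner below assumes Python 3.8+ (for math.dist) and reuses the doctest points:

```python
import math

print(math.dist((2, -1, 7), (1, -3, 5)))  # 3.0, same as distance(point1, point2)
```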
TheAlgorithms/Python
5,570
[mypy] annotate `compression`
### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
ErwinJunge
"2021-10-23T19:54:11Z"
"2021-10-26T10:29:28Z"
de07245c170f2007cef415fa4114be870078988e
e49d8e3af427353c18fe5f1afb0927e3e8d6461c
[mypy] annotate `compression`. ### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Reference: https://en.wikipedia.org/wiki/Gaussian_function """ from numpy import exp, pi, sqrt def gaussian(x, mu: float = 0.0, sigma: float = 1.0) -> int: """ >>> gaussian(1) 0.24197072451914337 >>> gaussian(24) 3.342714441794458e-126 >>> gaussian(1, 4, 2) 0.06475879783294587 >>> gaussian(1, 5, 3) 0.05467002489199788 Supports NumPy Arrays Use numpy.meshgrid with this to generate gaussian blur on images. >>> import numpy as np >>> x = np.arange(15) >>> gaussian(x) array([3.98942280e-01, 2.41970725e-01, 5.39909665e-02, 4.43184841e-03, 1.33830226e-04, 1.48671951e-06, 6.07588285e-09, 9.13472041e-12, 5.05227108e-15, 1.02797736e-18, 7.69459863e-23, 2.11881925e-27, 2.14638374e-32, 7.99882776e-38, 1.09660656e-43]) >>> gaussian(15) 5.530709549844416e-50 >>> gaussian([1,2, 'string']) Traceback (most recent call last): ... TypeError: unsupported operand type(s) for -: 'list' and 'float' >>> gaussian('hello world') Traceback (most recent call last): ... TypeError: unsupported operand type(s) for -: 'str' and 'float' >>> gaussian(10**234) # doctest: +IGNORE_EXCEPTION_DETAIL Traceback (most recent call last): ... OverflowError: (34, 'Result too large') >>> gaussian(10**-326) 0.3989422804014327 >>> gaussian(2523, mu=234234, sigma=3425) 0.0 """ return 1 / sqrt(2 * pi * sigma ** 2) * exp(-((x - mu) ** 2) / (2 * sigma ** 2)) if __name__ == "__main__": import doctest doctest.testmod()
""" Reference: https://en.wikipedia.org/wiki/Gaussian_function """ from numpy import exp, pi, sqrt def gaussian(x, mu: float = 0.0, sigma: float = 1.0) -> int: """ >>> gaussian(1) 0.24197072451914337 >>> gaussian(24) 3.342714441794458e-126 >>> gaussian(1, 4, 2) 0.06475879783294587 >>> gaussian(1, 5, 3) 0.05467002489199788 Supports NumPy Arrays Use numpy.meshgrid with this to generate gaussian blur on images. >>> import numpy as np >>> x = np.arange(15) >>> gaussian(x) array([3.98942280e-01, 2.41970725e-01, 5.39909665e-02, 4.43184841e-03, 1.33830226e-04, 1.48671951e-06, 6.07588285e-09, 9.13472041e-12, 5.05227108e-15, 1.02797736e-18, 7.69459863e-23, 2.11881925e-27, 2.14638374e-32, 7.99882776e-38, 1.09660656e-43]) >>> gaussian(15) 5.530709549844416e-50 >>> gaussian([1,2, 'string']) Traceback (most recent call last): ... TypeError: unsupported operand type(s) for -: 'list' and 'float' >>> gaussian('hello world') Traceback (most recent call last): ... TypeError: unsupported operand type(s) for -: 'str' and 'float' >>> gaussian(10**234) # doctest: +IGNORE_EXCEPTION_DETAIL Traceback (most recent call last): ... OverflowError: (34, 'Result too large') >>> gaussian(10**-326) 0.3989422804014327 >>> gaussian(2523, mu=234234, sigma=3425) 0.0 """ return 1 / sqrt(2 * pi * sigma ** 2) * exp(-((x - mu) ** 2) / (2 * sigma ** 2)) if __name__ == "__main__": import doctest doctest.testmod()
-1
TheAlgorithms/Python
5,570
[mypy] annotate `compression`
### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
ErwinJunge
"2021-10-23T19:54:11Z"
"2021-10-26T10:29:28Z"
de07245c170f2007cef415fa4114be870078988e
e49d8e3af427353c18fe5f1afb0927e3e8d6461c
[mypy] annotate `compression`. ### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Problem 31: https://projecteuler.net/problem=31 Coin sums In England the currency is made up of pound, £, and pence, p, and there are eight coins in general circulation: 1p, 2p, 5p, 10p, 20p, 50p, £1 (100p) and £2 (200p). It is possible to make £2 in the following way: 1×£1 + 1×50p + 2×20p + 1×5p + 1×2p + 3×1p How many different ways can £2 be made using any number of coins? Hint: > There are 100 pence in a pound (£1 = 100p) > There are coins(in pence) are available: 1, 2, 5, 10, 20, 50, 100 and 200. > how many different ways you can combine these values to create 200 pence. Example: to make 6p there are 5 ways 1,1,1,1,1,1 1,1,1,1,2 1,1,2,2 2,2,2 1,5 to make 5p there are 4 ways 1,1,1,1,1 1,1,1,2 1,2,2 5 """ def solution(pence: int = 200) -> int: """Returns the number of different ways to make X pence using any number of coins. The solution is based on dynamic programming paradigm in a bottom-up fashion. >>> solution(500) 6295434 >>> solution(200) 73682 >>> solution(50) 451 >>> solution(10) 11 """ coins = [1, 2, 5, 10, 20, 50, 100, 200] number_of_ways = [0] * (pence + 1) number_of_ways[0] = 1 # base case: 1 way to make 0 pence for coin in coins: for i in range(coin, pence + 1, 1): number_of_ways[i] += number_of_ways[i - coin] return number_of_ways[pence] if __name__ == "__main__": assert solution(200) == 73682
""" Problem 31: https://projecteuler.net/problem=31 Coin sums In England the currency is made up of pound, £, and pence, p, and there are eight coins in general circulation: 1p, 2p, 5p, 10p, 20p, 50p, £1 (100p) and £2 (200p). It is possible to make £2 in the following way: 1×£1 + 1×50p + 2×20p + 1×5p + 1×2p + 3×1p How many different ways can £2 be made using any number of coins? Hint: > There are 100 pence in a pound (£1 = 100p) > There are coins(in pence) are available: 1, 2, 5, 10, 20, 50, 100 and 200. > how many different ways you can combine these values to create 200 pence. Example: to make 6p there are 5 ways 1,1,1,1,1,1 1,1,1,1,2 1,1,2,2 2,2,2 1,5 to make 5p there are 4 ways 1,1,1,1,1 1,1,1,2 1,2,2 5 """ def solution(pence: int = 200) -> int: """Returns the number of different ways to make X pence using any number of coins. The solution is based on dynamic programming paradigm in a bottom-up fashion. >>> solution(500) 6295434 >>> solution(200) 73682 >>> solution(50) 451 >>> solution(10) 11 """ coins = [1, 2, 5, 10, 20, 50, 100, 200] number_of_ways = [0] * (pence + 1) number_of_ways[0] = 1 # base case: 1 way to make 0 pence for coin in coins: for i in range(coin, pence + 1, 1): number_of_ways[i] += number_of_ways[i - coin] return number_of_ways[pence] if __name__ == "__main__": assert solution(200) == 73682
-1
TheAlgorithms/Python
5,570
[mypy] annotate `compression`
### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
ErwinJunge
"2021-10-23T19:54:11Z"
"2021-10-26T10:29:28Z"
de07245c170f2007cef415fa4114be870078988e
e49d8e3af427353c18fe5f1afb0927e3e8d6461c
[mypy] annotate `compression`. ### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" https://en.wikipedia.org/wiki/Breadth-first_search pseudo-code: breadth_first_search(graph G, start vertex s): // all nodes initially unexplored mark s as explored let Q = queue data structure, initialized with s while Q is non-empty: remove the first node of Q, call it v for each edge(v, w): // for w in graph[v] if w unexplored: mark w as explored add w to Q (at the end) """ from __future__ import annotations G = { "A": ["B", "C"], "B": ["A", "D", "E"], "C": ["A", "F"], "D": ["B"], "E": ["B", "F"], "F": ["C", "E"], } def breadth_first_search(graph: dict, start: str) -> set[str]: """ >>> ''.join(sorted(breadth_first_search(G, 'A'))) 'ABCDEF' """ explored = {start} queue = [start] while queue: v = queue.pop(0) # queue.popleft() for w in graph[v]: if w not in explored: explored.add(w) queue.append(w) return explored if __name__ == "__main__": print(breadth_first_search(G, "A"))
""" https://en.wikipedia.org/wiki/Breadth-first_search pseudo-code: breadth_first_search(graph G, start vertex s): // all nodes initially unexplored mark s as explored let Q = queue data structure, initialized with s while Q is non-empty: remove the first node of Q, call it v for each edge(v, w): // for w in graph[v] if w unexplored: mark w as explored add w to Q (at the end) """ from __future__ import annotations G = { "A": ["B", "C"], "B": ["A", "D", "E"], "C": ["A", "F"], "D": ["B"], "E": ["B", "F"], "F": ["C", "E"], } def breadth_first_search(graph: dict, start: str) -> set[str]: """ >>> ''.join(sorted(breadth_first_search(G, 'A'))) 'ABCDEF' """ explored = {start} queue = [start] while queue: v = queue.pop(0) # queue.popleft() for w in graph[v]: if w not in explored: explored.add(w) queue.append(w) return explored if __name__ == "__main__": print(breadth_first_search(G, "A"))
-1
TheAlgorithms/Python
5,570
[mypy] annotate `compression`
### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
ErwinJunge
"2021-10-23T19:54:11Z"
"2021-10-26T10:29:28Z"
de07245c170f2007cef415fa4114be870078988e
e49d8e3af427353c18fe5f1afb0927e3e8d6461c
[mypy] annotate `compression`. ### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
-1
TheAlgorithms/Python
5,570
[mypy] annotate `compression`
### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
ErwinJunge
"2021-10-23T19:54:11Z"
"2021-10-26T10:29:28Z"
de07245c170f2007cef415fa4114be870078988e
e49d8e3af427353c18fe5f1afb0927e3e8d6461c
[mypy] annotate `compression`. ### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" In this problem, we want to determine all possible subsequences of the given sequence. We use backtracking to solve this problem. Time complexity: O(2^n), where n denotes the length of the given sequence. """ from __future__ import annotations from typing import Any def generate_all_subsequences(sequence: list[Any]) -> None: create_state_space_tree(sequence, [], 0) def create_state_space_tree( sequence: list[Any], current_subsequence: list[Any], index: int ) -> None: """ Creates a state space tree to iterate through each branch using DFS. We know that each state has exactly two children. It terminates when it reaches the end of the given sequence. """ if index == len(sequence): print(current_subsequence) return create_state_space_tree(sequence, current_subsequence, index + 1) current_subsequence.append(sequence[index]) create_state_space_tree(sequence, current_subsequence, index + 1) current_subsequence.pop() if __name__ == "__main__": seq: list[Any] = [3, 1, 2, 4] generate_all_subsequences(seq) seq.clear() seq.extend(["A", "B", "C"]) generate_all_subsequences(seq)
""" In this problem, we want to determine all possible subsequences of the given sequence. We use backtracking to solve this problem. Time complexity: O(2^n), where n denotes the length of the given sequence. """ from __future__ import annotations from typing import Any def generate_all_subsequences(sequence: list[Any]) -> None: create_state_space_tree(sequence, [], 0) def create_state_space_tree( sequence: list[Any], current_subsequence: list[Any], index: int ) -> None: """ Creates a state space tree to iterate through each branch using DFS. We know that each state has exactly two children. It terminates when it reaches the end of the given sequence. """ if index == len(sequence): print(current_subsequence) return create_state_space_tree(sequence, current_subsequence, index + 1) current_subsequence.append(sequence[index]) create_state_space_tree(sequence, current_subsequence, index + 1) current_subsequence.pop() if __name__ == "__main__": seq: list[Any] = [3, 1, 2, 4] generate_all_subsequences(seq) seq.clear() seq.extend(["A", "B", "C"]) generate_all_subsequences(seq)
-1
TheAlgorithms/Python
5,570
[mypy] annotate `compression`
### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
ErwinJunge
"2021-10-23T19:54:11Z"
"2021-10-26T10:29:28Z"
de07245c170f2007cef415fa4114be870078988e
e49d8e3af427353c18fe5f1afb0927e3e8d6461c
[mypy] annotate `compression`. ### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" References: https://en.wikipedia.org/wiki/M%C3%B6bius_function References: wikipedia:square free number python/black : True flake8 : True """ from maths.is_square_free import is_square_free from maths.prime_factors import prime_factors def mobius(n: int) -> int: """ Mobius function >>> mobius(24) 0 >>> mobius(-1) 1 >>> mobius('asd') Traceback (most recent call last): ... TypeError: '<=' not supported between instances of 'int' and 'str' >>> mobius(10**400) 0 >>> mobius(10**-400) 1 >>> mobius(-1424) 1 >>> mobius([1, '2', 2.0]) Traceback (most recent call last): ... TypeError: '<=' not supported between instances of 'int' and 'list' """ factors = prime_factors(n) if is_square_free(factors): return -1 if len(factors) % 2 else 1 return 0 if __name__ == "__main__": import doctest doctest.testmod()
""" References: https://en.wikipedia.org/wiki/M%C3%B6bius_function References: wikipedia:square free number python/black : True flake8 : True """ from maths.is_square_free import is_square_free from maths.prime_factors import prime_factors def mobius(n: int) -> int: """ Mobius function >>> mobius(24) 0 >>> mobius(-1) 1 >>> mobius('asd') Traceback (most recent call last): ... TypeError: '<=' not supported between instances of 'int' and 'str' >>> mobius(10**400) 0 >>> mobius(10**-400) 1 >>> mobius(-1424) 1 >>> mobius([1, '2', 2.0]) Traceback (most recent call last): ... TypeError: '<=' not supported between instances of 'int' and 'list' """ factors = prime_factors(n) if is_square_free(factors): return -1 if len(factors) % 2 else 1 return 0 if __name__ == "__main__": import doctest doctest.testmod()
-1
TheAlgorithms/Python
5,570
[mypy] annotate `compression`
### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
ErwinJunge
"2021-10-23T19:54:11Z"
"2021-10-26T10:29:28Z"
de07245c170f2007cef415fa4114be870078988e
e49d8e3af427353c18fe5f1afb0927e3e8d6461c
[mypy] annotate `compression`. ### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
# Print all combinations of r elements from a given set of n elements. def combination_util(arr, n, r, index, data, i): """ Current combination is ready to be printed, print it arr[] ---> Input Array data[] ---> Temporary array to store current combination start & end ---> Starting and Ending indexes in arr[] index ---> Current index in data[] r ---> Size of a combination to be printed """ if index == r: for j in range(r): print(data[j], end=" ") print(" ") return # When no more elements are there to put in data[] if i >= n: return # current is included, put next at next location data[index] = arr[i] combination_util(arr, n, r, index + 1, data, i + 1) # current is excluded, replace it with # next (Note that i+1 is passed, but # index is not changed) combination_util(arr, n, r, index, data, i + 1) # The main function that prints all combinations # of size r in arr[] of size n. This function # mainly uses combinationUtil() def print_combination(arr, n, r): # A temporary array to store all combination one by one data = [0] * r # Print all combination using temporary array 'data[]' combination_util(arr, n, r, 0, data, 0) # Driver function to check for above function arr = [10, 20, 30, 40, 50] print_combination(arr, len(arr), 3) # This code is contributed by Ambuj sahu
# Print all combinations of r elements from a given set of n elements. def combination_util(arr, n, r, index, data, i): """ Current combination is ready to be printed, print it arr[] ---> Input Array data[] ---> Temporary array to store current combination start & end ---> Starting and Ending indexes in arr[] index ---> Current index in data[] r ---> Size of a combination to be printed """ if index == r: for j in range(r): print(data[j], end=" ") print(" ") return # When no more elements are there to put in data[] if i >= n: return # current is included, put next at next location data[index] = arr[i] combination_util(arr, n, r, index + 1, data, i + 1) # current is excluded, replace it with # next (Note that i+1 is passed, but # index is not changed) combination_util(arr, n, r, index, data, i + 1) # The main function that prints all combinations # of size r in arr[] of size n. This function # mainly uses combinationUtil() def print_combination(arr, n, r): # A temporary array to store all combination one by one data = [0] * r # Print all combination using temporary array 'data[]' combination_util(arr, n, r, 0, data, 0) # Driver function to check for above function arr = [10, 20, 30, 40, 50] print_combination(arr, len(arr), 3) # This code is contributed by Ambuj sahu
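The include/exclude recursion above prints every size-r combination of the n inputs, C(n, r) lines in total. The standard library enumerates the same subsets, which makes a convenient cross-check (illustrative only, not part of the stored file):

```python
# Illustrative cross-check: itertools.combinations enumerates the same
# C(n, r) subsets that combination_util() prints above.
from itertools import combinations

arr = [10, 20, 30, 40, 50]
for combo in combinations(arr, 3):
    print(*combo)

print(len(list(combinations(arr, 3))))  # 10 == C(5, 3)
```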
-1
TheAlgorithms/Python
5,570
[mypy] annotate `compression`
### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
ErwinJunge
"2021-10-23T19:54:11Z"
"2021-10-26T10:29:28Z"
de07245c170f2007cef415fa4114be870078988e
e49d8e3af427353c18fe5f1afb0927e3e8d6461c
[mypy] annotate `compression`. ### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Project Euler Problem 35 https://projecteuler.net/problem=35 Problem Statement: The number 197 is called a circular prime because all rotations of the digits: 197, 971, and 719, are themselves prime. There are thirteen such primes below 100: 2, 3, 5, 7, 11, 13, 17, 31, 37, 71, 73, 79, and 97. How many circular primes are there below one million? To solve this problem in an efficient manner, we will first mark all the primes below 1 million using the Seive of Eratosthenes. Then, out of all these primes, we will rule out the numbers which contain an even digit. After this we will generate each circular combination of the number and check if all are prime. """ from __future__ import annotations seive = [True] * 1000001 i = 2 while i * i <= 1000000: if seive[i]: for j in range(i * i, 1000001, i): seive[j] = False i += 1 def is_prime(n: int) -> bool: """ For 2 <= n <= 1000000, return True if n is prime. >>> is_prime(87) False >>> is_prime(23) True >>> is_prime(25363) False """ return seive[n] def contains_an_even_digit(n: int) -> bool: """ Return True if n contains an even digit. >>> contains_an_even_digit(0) True >>> contains_an_even_digit(975317933) False >>> contains_an_even_digit(-245679) True """ return any(digit in "02468" for digit in str(n)) def find_circular_primes(limit: int = 1000000) -> list[int]: """ Return circular primes below limit. >>> len(find_circular_primes(100)) 13 >>> len(find_circular_primes(1000000)) 55 """ result = [2] # result already includes the number 2. for num in range(3, limit + 1, 2): if is_prime(num) and not contains_an_even_digit(num): str_num = str(num) list_nums = [int(str_num[j:] + str_num[:j]) for j in range(len(str_num))] if all(is_prime(i) for i in list_nums): result.append(num) return result def solution() -> int: """ >>> solution() 55 """ return len(find_circular_primes()) if __name__ == "__main__": print(f"{len(find_circular_primes()) = }")
""" Project Euler Problem 35 https://projecteuler.net/problem=35 Problem Statement: The number 197 is called a circular prime because all rotations of the digits: 197, 971, and 719, are themselves prime. There are thirteen such primes below 100: 2, 3, 5, 7, 11, 13, 17, 31, 37, 71, 73, 79, and 97. How many circular primes are there below one million? To solve this problem in an efficient manner, we will first mark all the primes below 1 million using the Seive of Eratosthenes. Then, out of all these primes, we will rule out the numbers which contain an even digit. After this we will generate each circular combination of the number and check if all are prime. """ from __future__ import annotations seive = [True] * 1000001 i = 2 while i * i <= 1000000: if seive[i]: for j in range(i * i, 1000001, i): seive[j] = False i += 1 def is_prime(n: int) -> bool: """ For 2 <= n <= 1000000, return True if n is prime. >>> is_prime(87) False >>> is_prime(23) True >>> is_prime(25363) False """ return seive[n] def contains_an_even_digit(n: int) -> bool: """ Return True if n contains an even digit. >>> contains_an_even_digit(0) True >>> contains_an_even_digit(975317933) False >>> contains_an_even_digit(-245679) True """ return any(digit in "02468" for digit in str(n)) def find_circular_primes(limit: int = 1000000) -> list[int]: """ Return circular primes below limit. >>> len(find_circular_primes(100)) 13 >>> len(find_circular_primes(1000000)) 55 """ result = [2] # result already includes the number 2. for num in range(3, limit + 1, 2): if is_prime(num) and not contains_an_even_digit(num): str_num = str(num) list_nums = [int(str_num[j:] + str_num[:j]) for j in range(len(str_num))] if all(is_prime(i) for i in list_nums): result.append(num) return result def solution() -> int: """ >>> solution() 55 """ return len(find_circular_primes()) if __name__ == "__main__": print(f"{len(find_circular_primes()) = }")
-1
TheAlgorithms/Python
5,570
[mypy] annotate `compression`
### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
ErwinJunge
"2021-10-23T19:54:11Z"
"2021-10-26T10:29:28Z"
de07245c170f2007cef415fa4114be870078988e
e49d8e3af427353c18fe5f1afb0927e3e8d6461c
[mypy] annotate `compression`. ### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" References: - http://neuralnetworksanddeeplearning.com/chap2.html (Backpropagation) - https://en.wikipedia.org/wiki/Sigmoid_function (Sigmoid activation function) - https://en.wikipedia.org/wiki/Feedforward_neural_network (Feedforward) """ import numpy class TwoHiddenLayerNeuralNetwork: def __init__(self, input_array: numpy.ndarray, output_array: numpy.ndarray) -> None: """ This function initializes the TwoHiddenLayerNeuralNetwork class with random weights for every layer and initializes predicted output with zeroes. input_array : input values for training the neural network (i.e training data) . output_array : expected output values of the given inputs. """ # Input values provided for training the model. self.input_array = input_array # Random initial weights are assigned where first argument is the # number of nodes in previous layer and second argument is the # number of nodes in the next layer. # Random initial weights are assigned. # self.input_array.shape[1] is used to represent number of nodes in input layer. # First hidden layer consists of 4 nodes. self.input_layer_and_first_hidden_layer_weights = numpy.random.rand( self.input_array.shape[1], 4 ) # Random initial values for the first hidden layer. # First hidden layer has 4 nodes. # Second hidden layer has 3 nodes. self.first_hidden_layer_and_second_hidden_layer_weights = numpy.random.rand( 4, 3 ) # Random initial values for the second hidden layer. # Second hidden layer has 3 nodes. # Output layer has 1 node. self.second_hidden_layer_and_output_layer_weights = numpy.random.rand(3, 1) # Real output values provided. self.output_array = output_array # Predicted output values by the neural network. # Predicted_output array initially consists of zeroes. self.predicted_output = numpy.zeros(output_array.shape) def feedforward(self) -> numpy.ndarray: """ The information moves in only one direction i.e. forward from the input nodes, through the two hidden nodes and to the output nodes. There are no cycles or loops in the network. Return layer_between_second_hidden_layer_and_output (i.e the last layer of the neural network). >>> input_val = numpy.array(([0, 0, 0], [0, 0, 0], [0, 0, 0]), dtype=float) >>> output_val = numpy.array(([0], [0], [0]), dtype=float) >>> nn = TwoHiddenLayerNeuralNetwork(input_val, output_val) >>> res = nn.feedforward() >>> array_sum = numpy.sum(res) >>> numpy.isnan(array_sum) False """ # Layer_between_input_and_first_hidden_layer is the layer connecting the # input nodes with the first hidden layer nodes. self.layer_between_input_and_first_hidden_layer = sigmoid( numpy.dot(self.input_array, self.input_layer_and_first_hidden_layer_weights) ) # layer_between_first_hidden_layer_and_second_hidden_layer is the layer # connecting the first hidden set of nodes with the second hidden set of nodes. self.layer_between_first_hidden_layer_and_second_hidden_layer = sigmoid( numpy.dot( self.layer_between_input_and_first_hidden_layer, self.first_hidden_layer_and_second_hidden_layer_weights, ) ) # layer_between_second_hidden_layer_and_output is the layer connecting # second hidden layer with the output node. self.layer_between_second_hidden_layer_and_output = sigmoid( numpy.dot( self.layer_between_first_hidden_layer_and_second_hidden_layer, self.second_hidden_layer_and_output_layer_weights, ) ) return self.layer_between_second_hidden_layer_and_output def back_propagation(self) -> None: """ Function for fine-tuning the weights of the neural net based on the error rate obtained in the previous epoch (i.e., iteration). 
Updation is done using derivative of sogmoid activation function. >>> input_val = numpy.array(([0, 0, 0], [0, 0, 0], [0, 0, 0]), dtype=float) >>> output_val = numpy.array(([0], [0], [0]), dtype=float) >>> nn = TwoHiddenLayerNeuralNetwork(input_val, output_val) >>> res = nn.feedforward() >>> nn.back_propagation() >>> updated_weights = nn.second_hidden_layer_and_output_layer_weights >>> (res == updated_weights).all() False """ updated_second_hidden_layer_and_output_layer_weights = numpy.dot( self.layer_between_first_hidden_layer_and_second_hidden_layer.T, 2 * (self.output_array - self.predicted_output) * sigmoid_derivative(self.predicted_output), ) updated_first_hidden_layer_and_second_hidden_layer_weights = numpy.dot( self.layer_between_input_and_first_hidden_layer.T, numpy.dot( 2 * (self.output_array - self.predicted_output) * sigmoid_derivative(self.predicted_output), self.second_hidden_layer_and_output_layer_weights.T, ) * sigmoid_derivative( self.layer_between_first_hidden_layer_and_second_hidden_layer ), ) updated_input_layer_and_first_hidden_layer_weights = numpy.dot( self.input_array.T, numpy.dot( numpy.dot( 2 * (self.output_array - self.predicted_output) * sigmoid_derivative(self.predicted_output), self.second_hidden_layer_and_output_layer_weights.T, ) * sigmoid_derivative( self.layer_between_first_hidden_layer_and_second_hidden_layer ), self.first_hidden_layer_and_second_hidden_layer_weights.T, ) * sigmoid_derivative(self.layer_between_input_and_first_hidden_layer), ) self.input_layer_and_first_hidden_layer_weights += ( updated_input_layer_and_first_hidden_layer_weights ) self.first_hidden_layer_and_second_hidden_layer_weights += ( updated_first_hidden_layer_and_second_hidden_layer_weights ) self.second_hidden_layer_and_output_layer_weights += ( updated_second_hidden_layer_and_output_layer_weights ) def train(self, output: numpy.ndarray, iterations: int, give_loss: bool) -> None: """ Performs the feedforwarding and back propagation process for the given number of iterations. Every iteration will update the weights of neural network. output : real output values,required for calculating loss. iterations : number of times the weights are to be updated. give_loss : boolean value, If True then prints loss for each iteration, If False then nothing is printed >>> input_val = numpy.array(([0, 0, 0], [0, 1, 0], [0, 0, 1]), dtype=float) >>> output_val = numpy.array(([0], [1], [1]), dtype=float) >>> nn = TwoHiddenLayerNeuralNetwork(input_val, output_val) >>> first_iteration_weights = nn.feedforward() >>> nn.back_propagation() >>> updated_weights = nn.second_hidden_layer_and_output_layer_weights >>> (first_iteration_weights == updated_weights).all() False """ for iteration in range(1, iterations + 1): self.output = self.feedforward() self.back_propagation() if give_loss: loss = numpy.mean(numpy.square(output - self.feedforward())) print(f"Iteration {iteration} Loss: {loss}") def predict(self, input: numpy.ndarray) -> int: """ Predict's the output for the given input values using the trained neural network. The output value given by the model ranges in-between 0 and 1. The predict function returns 1 if the model value is greater than the threshold value else returns 0, as the real output values are in binary. 
>>> input_val = numpy.array(([0, 0, 0], [0, 1, 0], [0, 0, 1]), dtype=float) >>> output_val = numpy.array(([0], [1], [1]), dtype=float) >>> nn = TwoHiddenLayerNeuralNetwork(input_val, output_val) >>> nn.train(output_val, 1000, False) >>> nn.predict([0,1,0]) in (0, 1) True """ # Input values for which the predictions are to be made. self.array = input self.layer_between_input_and_first_hidden_layer = sigmoid( numpy.dot(self.array, self.input_layer_and_first_hidden_layer_weights) ) self.layer_between_first_hidden_layer_and_second_hidden_layer = sigmoid( numpy.dot( self.layer_between_input_and_first_hidden_layer, self.first_hidden_layer_and_second_hidden_layer_weights, ) ) self.layer_between_second_hidden_layer_and_output = sigmoid( numpy.dot( self.layer_between_first_hidden_layer_and_second_hidden_layer, self.second_hidden_layer_and_output_layer_weights, ) ) return int(self.layer_between_second_hidden_layer_and_output > 0.6) def sigmoid(value: numpy.ndarray) -> numpy.ndarray: """ Applies sigmoid activation function. return normalized values >>> sigmoid(numpy.array(([1, 0, 2], [1, 0, 0]), dtype=numpy.float64)) array([[0.73105858, 0.5 , 0.88079708], [0.73105858, 0.5 , 0.5 ]]) """ return 1 / (1 + numpy.exp(-value)) def sigmoid_derivative(value: numpy.ndarray) -> numpy.ndarray: """ Provides the derivative value of the sigmoid function. returns derivative of the sigmoid value >>> sigmoid_derivative(numpy.array(([1, 0, 2], [1, 0, 0]), dtype=numpy.float64)) array([[ 0., 0., -2.], [ 0., 0., 0.]]) """ return (value) * (1 - (value)) def example() -> int: """ Example for "how to use the neural network class and use the respected methods for the desired output". Calls the TwoHiddenLayerNeuralNetwork class and provides the fixed input output values to the model. Model is trained for a fixed amount of iterations then the predict method is called. In this example the output is divided into 2 classes i.e. binary classification, the two classes are represented by '0' and '1'. >>> example() in (0, 1) True """ # Input values. input = numpy.array( ( [0, 0, 0], [0, 0, 1], [0, 1, 0], [0, 1, 1], [1, 0, 0], [1, 0, 1], [1, 1, 0], [1, 1, 1], ), dtype=numpy.float64, ) # True output values for the given input values. output = numpy.array(([0], [1], [1], [0], [1], [0], [0], [1]), dtype=numpy.float64) # Calling neural network class. neural_network = TwoHiddenLayerNeuralNetwork(input_array=input, output_array=output) # Calling training function. # Set give_loss to True if you want to see loss in every iteration. neural_network.train(output=output, iterations=10, give_loss=False) return neural_network.predict(numpy.array(([1, 1, 1]), dtype=numpy.float64)) if __name__ == "__main__": example()
""" References: - http://neuralnetworksanddeeplearning.com/chap2.html (Backpropagation) - https://en.wikipedia.org/wiki/Sigmoid_function (Sigmoid activation function) - https://en.wikipedia.org/wiki/Feedforward_neural_network (Feedforward) """ import numpy class TwoHiddenLayerNeuralNetwork: def __init__(self, input_array: numpy.ndarray, output_array: numpy.ndarray) -> None: """ This function initializes the TwoHiddenLayerNeuralNetwork class with random weights for every layer and initializes predicted output with zeroes. input_array : input values for training the neural network (i.e training data) . output_array : expected output values of the given inputs. """ # Input values provided for training the model. self.input_array = input_array # Random initial weights are assigned where first argument is the # number of nodes in previous layer and second argument is the # number of nodes in the next layer. # Random initial weights are assigned. # self.input_array.shape[1] is used to represent number of nodes in input layer. # First hidden layer consists of 4 nodes. self.input_layer_and_first_hidden_layer_weights = numpy.random.rand( self.input_array.shape[1], 4 ) # Random initial values for the first hidden layer. # First hidden layer has 4 nodes. # Second hidden layer has 3 nodes. self.first_hidden_layer_and_second_hidden_layer_weights = numpy.random.rand( 4, 3 ) # Random initial values for the second hidden layer. # Second hidden layer has 3 nodes. # Output layer has 1 node. self.second_hidden_layer_and_output_layer_weights = numpy.random.rand(3, 1) # Real output values provided. self.output_array = output_array # Predicted output values by the neural network. # Predicted_output array initially consists of zeroes. self.predicted_output = numpy.zeros(output_array.shape) def feedforward(self) -> numpy.ndarray: """ The information moves in only one direction i.e. forward from the input nodes, through the two hidden nodes and to the output nodes. There are no cycles or loops in the network. Return layer_between_second_hidden_layer_and_output (i.e the last layer of the neural network). >>> input_val = numpy.array(([0, 0, 0], [0, 0, 0], [0, 0, 0]), dtype=float) >>> output_val = numpy.array(([0], [0], [0]), dtype=float) >>> nn = TwoHiddenLayerNeuralNetwork(input_val, output_val) >>> res = nn.feedforward() >>> array_sum = numpy.sum(res) >>> numpy.isnan(array_sum) False """ # Layer_between_input_and_first_hidden_layer is the layer connecting the # input nodes with the first hidden layer nodes. self.layer_between_input_and_first_hidden_layer = sigmoid( numpy.dot(self.input_array, self.input_layer_and_first_hidden_layer_weights) ) # layer_between_first_hidden_layer_and_second_hidden_layer is the layer # connecting the first hidden set of nodes with the second hidden set of nodes. self.layer_between_first_hidden_layer_and_second_hidden_layer = sigmoid( numpy.dot( self.layer_between_input_and_first_hidden_layer, self.first_hidden_layer_and_second_hidden_layer_weights, ) ) # layer_between_second_hidden_layer_and_output is the layer connecting # second hidden layer with the output node. self.layer_between_second_hidden_layer_and_output = sigmoid( numpy.dot( self.layer_between_first_hidden_layer_and_second_hidden_layer, self.second_hidden_layer_and_output_layer_weights, ) ) return self.layer_between_second_hidden_layer_and_output def back_propagation(self) -> None: """ Function for fine-tuning the weights of the neural net based on the error rate obtained in the previous epoch (i.e., iteration). 
Updation is done using derivative of sogmoid activation function. >>> input_val = numpy.array(([0, 0, 0], [0, 0, 0], [0, 0, 0]), dtype=float) >>> output_val = numpy.array(([0], [0], [0]), dtype=float) >>> nn = TwoHiddenLayerNeuralNetwork(input_val, output_val) >>> res = nn.feedforward() >>> nn.back_propagation() >>> updated_weights = nn.second_hidden_layer_and_output_layer_weights >>> (res == updated_weights).all() False """ updated_second_hidden_layer_and_output_layer_weights = numpy.dot( self.layer_between_first_hidden_layer_and_second_hidden_layer.T, 2 * (self.output_array - self.predicted_output) * sigmoid_derivative(self.predicted_output), ) updated_first_hidden_layer_and_second_hidden_layer_weights = numpy.dot( self.layer_between_input_and_first_hidden_layer.T, numpy.dot( 2 * (self.output_array - self.predicted_output) * sigmoid_derivative(self.predicted_output), self.second_hidden_layer_and_output_layer_weights.T, ) * sigmoid_derivative( self.layer_between_first_hidden_layer_and_second_hidden_layer ), ) updated_input_layer_and_first_hidden_layer_weights = numpy.dot( self.input_array.T, numpy.dot( numpy.dot( 2 * (self.output_array - self.predicted_output) * sigmoid_derivative(self.predicted_output), self.second_hidden_layer_and_output_layer_weights.T, ) * sigmoid_derivative( self.layer_between_first_hidden_layer_and_second_hidden_layer ), self.first_hidden_layer_and_second_hidden_layer_weights.T, ) * sigmoid_derivative(self.layer_between_input_and_first_hidden_layer), ) self.input_layer_and_first_hidden_layer_weights += ( updated_input_layer_and_first_hidden_layer_weights ) self.first_hidden_layer_and_second_hidden_layer_weights += ( updated_first_hidden_layer_and_second_hidden_layer_weights ) self.second_hidden_layer_and_output_layer_weights += ( updated_second_hidden_layer_and_output_layer_weights ) def train(self, output: numpy.ndarray, iterations: int, give_loss: bool) -> None: """ Performs the feedforwarding and back propagation process for the given number of iterations. Every iteration will update the weights of neural network. output : real output values,required for calculating loss. iterations : number of times the weights are to be updated. give_loss : boolean value, If True then prints loss for each iteration, If False then nothing is printed >>> input_val = numpy.array(([0, 0, 0], [0, 1, 0], [0, 0, 1]), dtype=float) >>> output_val = numpy.array(([0], [1], [1]), dtype=float) >>> nn = TwoHiddenLayerNeuralNetwork(input_val, output_val) >>> first_iteration_weights = nn.feedforward() >>> nn.back_propagation() >>> updated_weights = nn.second_hidden_layer_and_output_layer_weights >>> (first_iteration_weights == updated_weights).all() False """ for iteration in range(1, iterations + 1): self.output = self.feedforward() self.back_propagation() if give_loss: loss = numpy.mean(numpy.square(output - self.feedforward())) print(f"Iteration {iteration} Loss: {loss}") def predict(self, input: numpy.ndarray) -> int: """ Predict's the output for the given input values using the trained neural network. The output value given by the model ranges in-between 0 and 1. The predict function returns 1 if the model value is greater than the threshold value else returns 0, as the real output values are in binary. 
>>> input_val = numpy.array(([0, 0, 0], [0, 1, 0], [0, 0, 1]), dtype=float) >>> output_val = numpy.array(([0], [1], [1]), dtype=float) >>> nn = TwoHiddenLayerNeuralNetwork(input_val, output_val) >>> nn.train(output_val, 1000, False) >>> nn.predict([0,1,0]) in (0, 1) True """ # Input values for which the predictions are to be made. self.array = input self.layer_between_input_and_first_hidden_layer = sigmoid( numpy.dot(self.array, self.input_layer_and_first_hidden_layer_weights) ) self.layer_between_first_hidden_layer_and_second_hidden_layer = sigmoid( numpy.dot( self.layer_between_input_and_first_hidden_layer, self.first_hidden_layer_and_second_hidden_layer_weights, ) ) self.layer_between_second_hidden_layer_and_output = sigmoid( numpy.dot( self.layer_between_first_hidden_layer_and_second_hidden_layer, self.second_hidden_layer_and_output_layer_weights, ) ) return int(self.layer_between_second_hidden_layer_and_output > 0.6) def sigmoid(value: numpy.ndarray) -> numpy.ndarray: """ Applies sigmoid activation function. return normalized values >>> sigmoid(numpy.array(([1, 0, 2], [1, 0, 0]), dtype=numpy.float64)) array([[0.73105858, 0.5 , 0.88079708], [0.73105858, 0.5 , 0.5 ]]) """ return 1 / (1 + numpy.exp(-value)) def sigmoid_derivative(value: numpy.ndarray) -> numpy.ndarray: """ Provides the derivative value of the sigmoid function. returns derivative of the sigmoid value >>> sigmoid_derivative(numpy.array(([1, 0, 2], [1, 0, 0]), dtype=numpy.float64)) array([[ 0., 0., -2.], [ 0., 0., 0.]]) """ return (value) * (1 - (value)) def example() -> int: """ Example for "how to use the neural network class and use the respected methods for the desired output". Calls the TwoHiddenLayerNeuralNetwork class and provides the fixed input output values to the model. Model is trained for a fixed amount of iterations then the predict method is called. In this example the output is divided into 2 classes i.e. binary classification, the two classes are represented by '0' and '1'. >>> example() in (0, 1) True """ # Input values. input = numpy.array( ( [0, 0, 0], [0, 0, 1], [0, 1, 0], [0, 1, 1], [1, 0, 0], [1, 0, 1], [1, 1, 0], [1, 1, 1], ), dtype=numpy.float64, ) # True output values for the given input values. output = numpy.array(([0], [1], [1], [0], [1], [0], [0], [1]), dtype=numpy.float64) # Calling neural network class. neural_network = TwoHiddenLayerNeuralNetwork(input_array=input, output_array=output) # Calling training function. # Set give_loss to True if you want to see loss in every iteration. neural_network.train(output=output, iterations=10, give_loss=False) return neural_network.predict(numpy.array(([1, 1, 1]), dtype=numpy.float64)) if __name__ == "__main__": example()
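One detail of the class above that is easy to misread: sigmoid_derivative(value) computes value * (1 - value), so it returns the true derivative only when value is already an activation, i.e. sigma'(x) = sigma(x) * (1 - sigma(x)); that is why back_propagation() passes it predicted outputs and layer activations rather than raw inputs. A short numerical check of that convention (illustrative sketch, not part of the stored file):

```python
# Numerical check of the sigmoid-derivative convention used above:
# sigmoid_derivative(a) equals d sigmoid(x)/dx only when a = sigmoid(x).
import numpy


def sigmoid(value: numpy.ndarray) -> numpy.ndarray:
    return 1 / (1 + numpy.exp(-value))


def sigmoid_derivative(value: numpy.ndarray) -> numpy.ndarray:
    return value * (1 - value)


x = numpy.linspace(-3, 3, 7)
analytic = sigmoid(x) * (1 - sigmoid(x))
numeric = (sigmoid(x + 1e-6) - sigmoid(x - 1e-6)) / 2e-6
print(numpy.allclose(analytic, numeric))                         # True
print(numpy.allclose(sigmoid_derivative(sigmoid(x)), analytic))  # True
```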
-1
TheAlgorithms/Python
5,570
[mypy] annotate `compression`
### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
ErwinJunge
"2021-10-23T19:54:11Z"
"2021-10-26T10:29:28Z"
de07245c170f2007cef415fa4114be870078988e
e49d8e3af427353c18fe5f1afb0927e3e8d6461c
[mypy] annotate `compression`. ### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
from __future__ import annotations from .abs import abs_val def abs_min(x: list[int]) -> int: """ >>> abs_min([0,5,1,11]) 0 >>> abs_min([3,-10,-2]) -2 >>> abs_min([]) Traceback (most recent call last): ... ValueError: abs_min() arg is an empty sequence """ if len(x) == 0: raise ValueError("abs_min() arg is an empty sequence") j = x[0] for i in x: if abs_val(i) < abs_val(j): j = i return j def main(): a = [-3, -1, 2, -11] print(abs_min(a)) # = -1 if __name__ == "__main__": import doctest doctest.testmod(verbose=True) main()
from __future__ import annotations from .abs import abs_val def abs_min(x: list[int]) -> int: """ >>> abs_min([0,5,1,11]) 0 >>> abs_min([3,-10,-2]) -2 >>> abs_min([]) Traceback (most recent call last): ... ValueError: abs_min() arg is an empty sequence """ if len(x) == 0: raise ValueError("abs_min() arg is an empty sequence") j = x[0] for i in x: if abs_val(i) < abs_val(j): j = i return j def main(): a = [-3, -1, 2, -11] print(abs_min(a)) # = -1 if __name__ == "__main__": import doctest doctest.testmod(verbose=True) main()
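For comparison only (not part of the PR content above): the same result can be obtained with the built-in min and a key function; a small sketch:

def abs_min_builtin(x: list[int]) -> int:
    # Equivalent to the loop-based abs_min above, using min with key=abs.
    if not x:
        raise ValueError("abs_min() arg is an empty sequence")
    return min(x, key=abs)

assert abs_min_builtin([0, 5, 1, 11]) == 0
assert abs_min_builtin([3, -10, -2]) == -2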
-1
TheAlgorithms/Python
5,570
[mypy] annotate `compression`
### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
ErwinJunge
"2021-10-23T19:54:11Z"
"2021-10-26T10:29:28Z"
de07245c170f2007cef415fa4114be870078988e
e49d8e3af427353c18fe5f1afb0927e3e8d6461c
[mypy] annotate `compression`. ### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Project Euler Problem 91: https://projecteuler.net/problem=91 The points P (x1, y1) and Q (x2, y2) are plotted at integer coordinates and are joined to the origin, O(0,0), to form ΔOPQ.  There are exactly fourteen triangles containing a right angle that can be formed when each coordinate lies between 0 and 2 inclusive; that is, 0 ≤ x1, y1, x2, y2 ≤ 2.  Given that 0 ≤ x1, y1, x2, y2 ≤ 50, how many right triangles can be formed? """ from itertools import combinations, product def is_right(x1: int, y1: int, x2: int, y2: int) -> bool: """ Check if the triangle described by P(x1,y1), Q(x2,y2) and O(0,0) is right-angled. Note: this doesn't check if P and Q are equal, but that's handled by the use of itertools.combinations in the solution function. >>> is_right(0, 1, 2, 0) True >>> is_right(1, 0, 2, 2) False """ if x1 == y1 == 0 or x2 == y2 == 0: return False a_square = x1 * x1 + y1 * y1 b_square = x2 * x2 + y2 * y2 c_square = (x1 - x2) * (x1 - x2) + (y1 - y2) * (y1 - y2) return ( a_square + b_square == c_square or a_square + c_square == b_square or b_square + c_square == a_square ) def solution(limit: int = 50) -> int: """ Return the number of right triangles OPQ that can be formed by two points P, Q which have both x- and y- coordinates between 0 and limit inclusive. >>> solution(2) 14 >>> solution(10) 448 """ return sum( 1 for pt1, pt2 in combinations(product(range(limit + 1), repeat=2), 2) if is_right(*pt1, *pt2) ) if __name__ == "__main__": print(f"{solution() = }")
""" Project Euler Problem 91: https://projecteuler.net/problem=91 The points P (x1, y1) and Q (x2, y2) are plotted at integer coordinates and are joined to the origin, O(0,0), to form ΔOPQ.  There are exactly fourteen triangles containing a right angle that can be formed when each coordinate lies between 0 and 2 inclusive; that is, 0 ≤ x1, y1, x2, y2 ≤ 2.  Given that 0 ≤ x1, y1, x2, y2 ≤ 50, how many right triangles can be formed? """ from itertools import combinations, product def is_right(x1: int, y1: int, x2: int, y2: int) -> bool: """ Check if the triangle described by P(x1,y1), Q(x2,y2) and O(0,0) is right-angled. Note: this doesn't check if P and Q are equal, but that's handled by the use of itertools.combinations in the solution function. >>> is_right(0, 1, 2, 0) True >>> is_right(1, 0, 2, 2) False """ if x1 == y1 == 0 or x2 == y2 == 0: return False a_square = x1 * x1 + y1 * y1 b_square = x2 * x2 + y2 * y2 c_square = (x1 - x2) * (x1 - x2) + (y1 - y2) * (y1 - y2) return ( a_square + b_square == c_square or a_square + c_square == b_square or b_square + c_square == a_square ) def solution(limit: int = 50) -> int: """ Return the number of right triangles OPQ that can be formed by two points P, Q which have both x- and y- coordinates between 0 and limit inclusive. >>> solution(2) 14 >>> solution(10) 448 """ return sum( 1 for pt1, pt2 in combinations(product(range(limit + 1), repeat=2), 2) if is_right(*pt1, *pt2) ) if __name__ == "__main__": print(f"{solution() = }")
-1
TheAlgorithms/Python
5,570
[mypy] annotate `compression`
### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
ErwinJunge
"2021-10-23T19:54:11Z"
"2021-10-26T10:29:28Z"
de07245c170f2007cef415fa4114be870078988e
e49d8e3af427353c18fe5f1afb0927e3e8d6461c
[mypy] annotate `compression`. ### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
def moveTower(height, fromPole, toPole, withPole): """ >>> moveTower(3, 'A', 'B', 'C') moving disk from A to B moving disk from A to C moving disk from B to C moving disk from A to B moving disk from C to A moving disk from C to B moving disk from A to B """ if height >= 1: moveTower(height - 1, fromPole, withPole, toPole) moveDisk(fromPole, toPole) moveTower(height - 1, withPole, toPole, fromPole) def moveDisk(fp, tp): print("moving disk from", fp, "to", tp) def main(): height = int(input("Height of hanoi: ").strip()) moveTower(height, "A", "B", "C") if __name__ == "__main__": main()
def moveTower(height, fromPole, toPole, withPole): """ >>> moveTower(3, 'A', 'B', 'C') moving disk from A to B moving disk from A to C moving disk from B to C moving disk from A to B moving disk from C to A moving disk from C to B moving disk from A to B """ if height >= 1: moveTower(height - 1, fromPole, withPole, toPole) moveDisk(fromPole, toPole) moveTower(height - 1, withPole, toPole, fromPole) def moveDisk(fp, tp): print("moving disk from", fp, "to", tp) def main(): height = int(input("Height of hanoi: ").strip()) moveTower(height, "A", "B", "C") if __name__ == "__main__": main()
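A small illustrative sketch (not part of the file above) that counts the moves generated by the same recursion; for a tower of height n the count should always be 2**n - 1:

def count_moves(height: int) -> int:
    # Same recursion shape as moveTower, counting moves instead of printing them.
    if height < 1:
        return 0
    return 2 * count_moves(height - 1) + 1

for n in range(1, 6):
    assert count_moves(n) == 2 ** n - 1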
-1
TheAlgorithms/Python
5,570
[mypy] annotate `compression`
### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
ErwinJunge
"2021-10-23T19:54:11Z"
"2021-10-26T10:29:28Z"
de07245c170f2007cef415fa4114be870078988e
e49d8e3af427353c18fe5f1afb0927e3e8d6461c
[mypy] annotate `compression`. ### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Prim's (also known as Jarník's) algorithm is a greedy algorithm that finds a minimum spanning tree for a weighted undirected graph. This means it finds a subset of the edges that forms a tree that includes every vertex, where the total weight of all the edges in the tree is minimized. The algorithm operates by building this tree one vertex at a time, from an arbitrary starting vertex, at each step adding the cheapest possible connection from the tree to another vertex. """ from __future__ import annotations from sys import maxsize from typing import Generic, TypeVar T = TypeVar("T") def get_parent_position(position: int) -> int: """ heap helper function get the position of the parent of the current node >>> get_parent_position(1) 0 >>> get_parent_position(2) 0 """ return (position - 1) // 2 def get_child_left_position(position: int) -> int: """ heap helper function get the position of the left child of the current node >>> get_child_left_position(0) 1 """ return (2 * position) + 1 def get_child_right_position(position: int) -> int: """ heap helper function get the position of the right child of the current node >>> get_child_right_position(0) 2 """ return (2 * position) + 2 class MinPriorityQueue(Generic[T]): """ Minimum Priority Queue Class Functions: is_empty: function to check if the priority queue is empty push: function to add an element with given priority to the queue extract_min: function to remove and return the element with lowest weight (highest priority) update_key: function to update the weight of the given key _bubble_up: helper function to place a node at the proper position (upward movement) _bubble_down: helper function to place a node at the proper position (downward movement) _swap_nodes: helper function to swap the nodes at the given positions >>> queue = MinPriorityQueue() >>> queue.push(1, 1000) >>> queue.push(2, 100) >>> queue.push(3, 4000) >>> queue.push(4, 3000) >>> print(queue.extract_min()) 2 >>> queue.update_key(4, 50) >>> print(queue.extract_min()) 4 >>> print(queue.extract_min()) 1 >>> print(queue.extract_min()) 3 """ def __init__(self) -> None: self.heap: list[tuple[T, int]] = [] self.position_map: dict[T, int] = {} self.elements: int = 0 def __len__(self) -> int: return self.elements def __repr__(self) -> str: return str(self.heap) def is_empty(self) -> bool: # Check if the priority queue is empty return self.elements == 0 def push(self, elem: T, weight: int) -> None: # Add an element with given priority to the queue self.heap.append((elem, weight)) self.position_map[elem] = self.elements self.elements += 1 self._bubble_up(elem) def extract_min(self) -> T: # Remove and return the element with lowest weight (highest priority) if self.elements > 1: self._swap_nodes(0, self.elements - 1) elem, _ = self.heap.pop() del self.position_map[elem] self.elements -= 1 if self.elements > 0: bubble_down_elem, _ = self.heap[0] self._bubble_down(bubble_down_elem) return elem def update_key(self, elem: T, weight: int) -> None: # Update the weight of the given key position = self.position_map[elem] self.heap[position] = (elem, weight) if position > 0: parent_position = get_parent_position(position) _, parent_weight = self.heap[parent_position] if parent_weight > weight: self._bubble_up(elem) else: self._bubble_down(elem) else: self._bubble_down(elem) def _bubble_up(self, elem: T) -> None: # Place a node at the proper position (upward movement) [to be used internally # only] curr_pos = self.position_map[elem] if curr_pos == 0: return parent_position = 
get_parent_position(curr_pos) _, weight = self.heap[curr_pos] _, parent_weight = self.heap[parent_position] if parent_weight > weight: self._swap_nodes(parent_position, curr_pos) return self._bubble_up(elem) return def _bubble_down(self, elem: T) -> None: # Place a node at the proper position (downward movement) [to be used # internally only] curr_pos = self.position_map[elem] _, weight = self.heap[curr_pos] child_left_position = get_child_left_position(curr_pos) child_right_position = get_child_right_position(curr_pos) if child_left_position < self.elements and child_right_position < self.elements: _, child_left_weight = self.heap[child_left_position] _, child_right_weight = self.heap[child_right_position] if child_right_weight < child_left_weight: if child_right_weight < weight: self._swap_nodes(child_right_position, curr_pos) return self._bubble_down(elem) if child_left_position < self.elements: _, child_left_weight = self.heap[child_left_position] if child_left_weight < weight: self._swap_nodes(child_left_position, curr_pos) return self._bubble_down(elem) else: return if child_right_position < self.elements: _, child_right_weight = self.heap[child_right_position] if child_right_weight < weight: self._swap_nodes(child_right_position, curr_pos) return self._bubble_down(elem) else: return def _swap_nodes(self, node1_pos: int, node2_pos: int) -> None: # Swap the nodes at the given positions node1_elem = self.heap[node1_pos][0] node2_elem = self.heap[node2_pos][0] self.heap[node1_pos], self.heap[node2_pos] = ( self.heap[node2_pos], self.heap[node1_pos], ) self.position_map[node1_elem] = node2_pos self.position_map[node2_elem] = node1_pos class GraphUndirectedWeighted(Generic[T]): """ Graph Undirected Weighted Class Functions: add_node: function to add a node in the graph add_edge: function to add an edge between 2 nodes in the graph """ def __init__(self) -> None: self.connections: dict[T, dict[T, int]] = {} self.nodes: int = 0 def __repr__(self) -> str: return str(self.connections) def __len__(self) -> int: return self.nodes def add_node(self, node: T) -> None: # Add a node in the graph if it is not in the graph if node not in self.connections: self.connections[node] = {} self.nodes += 1 def add_edge(self, node1: T, node2: T, weight: int) -> None: # Add an edge between 2 nodes in the graph self.add_node(node1) self.add_node(node2) self.connections[node1][node2] = weight self.connections[node2][node1] = weight def prims_algo( graph: GraphUndirectedWeighted[T], ) -> tuple[dict[T, int], dict[T, T | None]]: """ >>> graph = GraphUndirectedWeighted() >>> graph.add_edge("a", "b", 3) >>> graph.add_edge("b", "c", 10) >>> graph.add_edge("c", "d", 5) >>> graph.add_edge("a", "c", 15) >>> graph.add_edge("b", "d", 100) >>> dist, parent = prims_algo(graph) >>> abs(dist["a"] - dist["b"]) 3 >>> abs(dist["d"] - dist["b"]) 15 >>> abs(dist["a"] - dist["c"]) 13 """ # prim's algorithm for minimum spanning tree dist: dict[T, int] = {node: maxsize for node in graph.connections} parent: dict[T, T | None] = {node: None for node in graph.connections} priority_queue: MinPriorityQueue[T] = MinPriorityQueue() for node, weight in dist.items(): priority_queue.push(node, weight) if priority_queue.is_empty(): return dist, parent # initialization node = priority_queue.extract_min() dist[node] = 0 for neighbour in graph.connections[node]: if dist[neighbour] > dist[node] + graph.connections[node][neighbour]: dist[neighbour] = dist[node] + graph.connections[node][neighbour] priority_queue.update_key(neighbour, dist[neighbour]) 
parent[neighbour] = node # running prim's algorithm while not priority_queue.is_empty(): node = priority_queue.extract_min() for neighbour in graph.connections[node]: if dist[neighbour] > dist[node] + graph.connections[node][neighbour]: dist[neighbour] = dist[node] + graph.connections[node][neighbour] priority_queue.update_key(neighbour, dist[neighbour]) parent[neighbour] = node return dist, parent
""" Prim's (also known as Jarník's) algorithm is a greedy algorithm that finds a minimum spanning tree for a weighted undirected graph. This means it finds a subset of the edges that forms a tree that includes every vertex, where the total weight of all the edges in the tree is minimized. The algorithm operates by building this tree one vertex at a time, from an arbitrary starting vertex, at each step adding the cheapest possible connection from the tree to another vertex. """ from __future__ import annotations from sys import maxsize from typing import Generic, TypeVar T = TypeVar("T") def get_parent_position(position: int) -> int: """ heap helper function get the position of the parent of the current node >>> get_parent_position(1) 0 >>> get_parent_position(2) 0 """ return (position - 1) // 2 def get_child_left_position(position: int) -> int: """ heap helper function get the position of the left child of the current node >>> get_child_left_position(0) 1 """ return (2 * position) + 1 def get_child_right_position(position: int) -> int: """ heap helper function get the position of the right child of the current node >>> get_child_right_position(0) 2 """ return (2 * position) + 2 class MinPriorityQueue(Generic[T]): """ Minimum Priority Queue Class Functions: is_empty: function to check if the priority queue is empty push: function to add an element with given priority to the queue extract_min: function to remove and return the element with lowest weight (highest priority) update_key: function to update the weight of the given key _bubble_up: helper function to place a node at the proper position (upward movement) _bubble_down: helper function to place a node at the proper position (downward movement) _swap_nodes: helper function to swap the nodes at the given positions >>> queue = MinPriorityQueue() >>> queue.push(1, 1000) >>> queue.push(2, 100) >>> queue.push(3, 4000) >>> queue.push(4, 3000) >>> print(queue.extract_min()) 2 >>> queue.update_key(4, 50) >>> print(queue.extract_min()) 4 >>> print(queue.extract_min()) 1 >>> print(queue.extract_min()) 3 """ def __init__(self) -> None: self.heap: list[tuple[T, int]] = [] self.position_map: dict[T, int] = {} self.elements: int = 0 def __len__(self) -> int: return self.elements def __repr__(self) -> str: return str(self.heap) def is_empty(self) -> bool: # Check if the priority queue is empty return self.elements == 0 def push(self, elem: T, weight: int) -> None: # Add an element with given priority to the queue self.heap.append((elem, weight)) self.position_map[elem] = self.elements self.elements += 1 self._bubble_up(elem) def extract_min(self) -> T: # Remove and return the element with lowest weight (highest priority) if self.elements > 1: self._swap_nodes(0, self.elements - 1) elem, _ = self.heap.pop() del self.position_map[elem] self.elements -= 1 if self.elements > 0: bubble_down_elem, _ = self.heap[0] self._bubble_down(bubble_down_elem) return elem def update_key(self, elem: T, weight: int) -> None: # Update the weight of the given key position = self.position_map[elem] self.heap[position] = (elem, weight) if position > 0: parent_position = get_parent_position(position) _, parent_weight = self.heap[parent_position] if parent_weight > weight: self._bubble_up(elem) else: self._bubble_down(elem) else: self._bubble_down(elem) def _bubble_up(self, elem: T) -> None: # Place a node at the proper position (upward movement) [to be used internally # only] curr_pos = self.position_map[elem] if curr_pos == 0: return parent_position = 
get_parent_position(curr_pos) _, weight = self.heap[curr_pos] _, parent_weight = self.heap[parent_position] if parent_weight > weight: self._swap_nodes(parent_position, curr_pos) return self._bubble_up(elem) return def _bubble_down(self, elem: T) -> None: # Place a node at the proper position (downward movement) [to be used # internally only] curr_pos = self.position_map[elem] _, weight = self.heap[curr_pos] child_left_position = get_child_left_position(curr_pos) child_right_position = get_child_right_position(curr_pos) if child_left_position < self.elements and child_right_position < self.elements: _, child_left_weight = self.heap[child_left_position] _, child_right_weight = self.heap[child_right_position] if child_right_weight < child_left_weight: if child_right_weight < weight: self._swap_nodes(child_right_position, curr_pos) return self._bubble_down(elem) if child_left_position < self.elements: _, child_left_weight = self.heap[child_left_position] if child_left_weight < weight: self._swap_nodes(child_left_position, curr_pos) return self._bubble_down(elem) else: return if child_right_position < self.elements: _, child_right_weight = self.heap[child_right_position] if child_right_weight < weight: self._swap_nodes(child_right_position, curr_pos) return self._bubble_down(elem) else: return def _swap_nodes(self, node1_pos: int, node2_pos: int) -> None: # Swap the nodes at the given positions node1_elem = self.heap[node1_pos][0] node2_elem = self.heap[node2_pos][0] self.heap[node1_pos], self.heap[node2_pos] = ( self.heap[node2_pos], self.heap[node1_pos], ) self.position_map[node1_elem] = node2_pos self.position_map[node2_elem] = node1_pos class GraphUndirectedWeighted(Generic[T]): """ Graph Undirected Weighted Class Functions: add_node: function to add a node in the graph add_edge: function to add an edge between 2 nodes in the graph """ def __init__(self) -> None: self.connections: dict[T, dict[T, int]] = {} self.nodes: int = 0 def __repr__(self) -> str: return str(self.connections) def __len__(self) -> int: return self.nodes def add_node(self, node: T) -> None: # Add a node in the graph if it is not in the graph if node not in self.connections: self.connections[node] = {} self.nodes += 1 def add_edge(self, node1: T, node2: T, weight: int) -> None: # Add an edge between 2 nodes in the graph self.add_node(node1) self.add_node(node2) self.connections[node1][node2] = weight self.connections[node2][node1] = weight def prims_algo( graph: GraphUndirectedWeighted[T], ) -> tuple[dict[T, int], dict[T, T | None]]: """ >>> graph = GraphUndirectedWeighted() >>> graph.add_edge("a", "b", 3) >>> graph.add_edge("b", "c", 10) >>> graph.add_edge("c", "d", 5) >>> graph.add_edge("a", "c", 15) >>> graph.add_edge("b", "d", 100) >>> dist, parent = prims_algo(graph) >>> abs(dist["a"] - dist["b"]) 3 >>> abs(dist["d"] - dist["b"]) 15 >>> abs(dist["a"] - dist["c"]) 13 """ # prim's algorithm for minimum spanning tree dist: dict[T, int] = {node: maxsize for node in graph.connections} parent: dict[T, T | None] = {node: None for node in graph.connections} priority_queue: MinPriorityQueue[T] = MinPriorityQueue() for node, weight in dist.items(): priority_queue.push(node, weight) if priority_queue.is_empty(): return dist, parent # initialization node = priority_queue.extract_min() dist[node] = 0 for neighbour in graph.connections[node]: if dist[neighbour] > dist[node] + graph.connections[node][neighbour]: dist[neighbour] = dist[node] + graph.connections[node][neighbour] priority_queue.update_key(neighbour, dist[neighbour]) 
parent[neighbour] = node # running prim's algorithm while not priority_queue.is_empty(): node = priority_queue.extract_min() for neighbour in graph.connections[node]: if dist[neighbour] > dist[node] + graph.connections[node][neighbour]: dist[neighbour] = dist[node] + graph.connections[node][neighbour] priority_queue.update_key(neighbour, dist[neighbour]) parent[neighbour] = node return dist, parent
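A hedged usage sketch (assuming GraphUndirectedWeighted and prims_algo above are in scope): the minimum spanning tree edges can be read off the returned parent dictionary.

graph = GraphUndirectedWeighted()
for u, v, w in [("a", "b", 3), ("b", "c", 10), ("c", "d", 5), ("a", "c", 15), ("b", "d", 100)]:
    graph.add_edge(u, v, w)

dist, parent = prims_algo(graph)
# Every non-root node contributes one MST edge (parent[node], node).
mst_edges = [(parent[node], node) for node in parent if parent[node] is not None]
print(mst_edges)  # e.g. [('a', 'b'), ('b', 'c'), ('c', 'd')] for this graph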
-1
TheAlgorithms/Python
5,570
[mypy] annotate `compression`
### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
ErwinJunge
"2021-10-23T19:54:11Z"
"2021-10-26T10:29:28Z"
de07245c170f2007cef415fa4114be870078988e
e49d8e3af427353c18fe5f1afb0927e3e8d6461c
[mypy] annotate `compression`. ### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
from __future__ import annotations def find_max(nums: list[int | float]) -> int | float: """ >>> for nums in ([3, 2, 1], [-3, -2, -1], [3, -3, 0], [3.0, 3.1, 2.9]): ... find_max(nums) == max(nums) True True True True >>> find_max([2, 4, 9, 7, 19, 94, 5]) 94 >>> find_max([]) Traceback (most recent call last): ... ValueError: find_max() arg is an empty sequence """ if len(nums) == 0: raise ValueError("find_max() arg is an empty sequence") max_num = nums[0] for x in nums: if x > max_num: max_num = x return max_num if __name__ == "__main__": import doctest doctest.testmod(verbose=True)
from __future__ import annotations def find_max(nums: list[int | float]) -> int | float: """ >>> for nums in ([3, 2, 1], [-3, -2, -1], [3, -3, 0], [3.0, 3.1, 2.9]): ... find_max(nums) == max(nums) True True True True >>> find_max([2, 4, 9, 7, 19, 94, 5]) 94 >>> find_max([]) Traceback (most recent call last): ... ValueError: find_max() arg is an empty sequence """ if len(nums) == 0: raise ValueError("find_max() arg is an empty sequence") max_num = nums[0] for x in nums: if x > max_num: max_num = x return max_num if __name__ == "__main__": import doctest doctest.testmod(verbose=True)
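A common variation, shown only as an illustrative sketch (not part of this file): returning the index of the maximum along with its value.

def find_max_with_index(nums: list[float]) -> tuple[float, int]:
    if not nums:
        raise ValueError("find_max_with_index() arg is an empty sequence")
    best, best_idx = nums[0], 0
    for idx, x in enumerate(nums):
        if x > best:
            best, best_idx = x, idx
    return best, best_idx

assert find_max_with_index([2, 4, 9, 7, 19, 94, 5]) == (94, 5)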
-1
TheAlgorithms/Python
5,570
[mypy] annotate `compression`
### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
ErwinJunge
"2021-10-23T19:54:11Z"
"2021-10-26T10:29:28Z"
de07245c170f2007cef415fa4114be870078988e
e49d8e3af427353c18fe5f1afb0927e3e8d6461c
[mypy] annotate `compression`. ### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
def bubble_sort(collection): """Pure implementation of bubble sort algorithm in Python :param collection: some mutable ordered collection with heterogeneous comparable items inside :return: the same collection ordered by ascending Examples: >>> bubble_sort([0, 5, 2, 3, 2]) [0, 2, 2, 3, 5] >>> bubble_sort([0, 5, 2, 3, 2]) == sorted([0, 5, 2, 3, 2]) True >>> bubble_sort([]) == sorted([]) True >>> bubble_sort([-2, -45, -5]) == sorted([-2, -45, -5]) True >>> bubble_sort([-23, 0, 6, -4, 34]) == sorted([-23, 0, 6, -4, 34]) True >>> bubble_sort(['d', 'a', 'b', 'e', 'c']) == sorted(['d', 'a', 'b', 'e', 'c']) True >>> import random >>> collection = random.sample(range(-50, 50), 100) >>> bubble_sort(collection) == sorted(collection) True >>> import string >>> collection = random.choices(string.ascii_letters + string.digits, k=100) >>> bubble_sort(collection) == sorted(collection) True """ length = len(collection) for i in range(length - 1): swapped = False for j in range(length - 1 - i): if collection[j] > collection[j + 1]: swapped = True collection[j], collection[j + 1] = collection[j + 1], collection[j] if not swapped: break # Stop iteration if the collection is sorted. return collection if __name__ == "__main__": import doctest import time doctest.testmod() user_input = input("Enter numbers separated by a comma:").strip() unsorted = [int(item) for item in user_input.split(",")] start = time.process_time() print(*bubble_sort(unsorted), sep=",") print(f"Processing time: {time.process_time() - start}")
def bubble_sort(collection): """Pure implementation of bubble sort algorithm in Python :param collection: some mutable ordered collection with heterogeneous comparable items inside :return: the same collection ordered by ascending Examples: >>> bubble_sort([0, 5, 2, 3, 2]) [0, 2, 2, 3, 5] >>> bubble_sort([0, 5, 2, 3, 2]) == sorted([0, 5, 2, 3, 2]) True >>> bubble_sort([]) == sorted([]) True >>> bubble_sort([-2, -45, -5]) == sorted([-2, -45, -5]) True >>> bubble_sort([-23, 0, 6, -4, 34]) == sorted([-23, 0, 6, -4, 34]) True >>> bubble_sort(['d', 'a', 'b', 'e', 'c']) == sorted(['d', 'a', 'b', 'e', 'c']) True >>> import random >>> collection = random.sample(range(-50, 50), 100) >>> bubble_sort(collection) == sorted(collection) True >>> import string >>> collection = random.choices(string.ascii_letters + string.digits, k=100) >>> bubble_sort(collection) == sorted(collection) True """ length = len(collection) for i in range(length - 1): swapped = False for j in range(length - 1 - i): if collection[j] > collection[j + 1]: swapped = True collection[j], collection[j + 1] = collection[j + 1], collection[j] if not swapped: break # Stop iteration if the collection is sorted. return collection if __name__ == "__main__": import doctest import time doctest.testmod() user_input = input("Enter numbers separated by a comma:").strip() unsorted = [int(item) for item in user_input.split(",")] start = time.process_time() print(*bubble_sort(unsorted), sep=",") print(f"Processing time: {time.process_time() - start}")
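The swapped flag above gives bubble sort an O(n) best case on already-sorted input; a small illustrative sketch that counts the outer passes actually performed:

def bubble_sort_passes(collection: list) -> int:
    # Same loop structure as bubble_sort, but returns how many passes ran.
    data = list(collection)
    passes = 0
    for i in range(len(data) - 1):
        passes += 1
        swapped = False
        for j in range(len(data) - 1 - i):
            if data[j] > data[j + 1]:
                data[j], data[j + 1] = data[j + 1], data[j]
                swapped = True
        if not swapped:
            break
    return passes

assert bubble_sort_passes([1, 2, 3, 4, 5]) == 1   # already sorted: single pass
assert bubble_sort_passes([5, 4, 3, 2, 1]) == 4   # reversed: n - 1 passes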
-1
TheAlgorithms/Python
5,570
[mypy] annotate `compression`
### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
ErwinJunge
"2021-10-23T19:54:11Z"
"2021-10-26T10:29:28Z"
de07245c170f2007cef415fa4114be870078988e
e49d8e3af427353c18fe5f1afb0927e3e8d6461c
[mypy] annotate `compression`. ### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
-1
TheAlgorithms/Python
5,570
[mypy] annotate `compression`
### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
ErwinJunge
"2021-10-23T19:54:11Z"
"2021-10-26T10:29:28Z"
de07245c170f2007cef415fa4114be870078988e
e49d8e3af427353c18fe5f1afb0927e3e8d6461c
[mypy] annotate `compression`. ### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Implementation of sequential minimal optimization (SMO) for support vector machines (SVM). Sequential minimal optimization (SMO) is an algorithm for solving the quadratic programming (QP) problem that arises during the training of support vector machines. It was invented by John Platt in 1998. Input: 0: type: numpy.ndarray. 1: first column of ndarray must be tags of samples, must be 1 or -1. 2: rows of ndarray represent samples. Usage: Command: python3 sequential_minimum_optimization.py Code: from sequential_minimum_optimization import SmoSVM, Kernel kernel = Kernel(kernel='poly', degree=3., coef0=1., gamma=0.5) init_alphas = np.zeros(train.shape[0]) SVM = SmoSVM(train=train, alpha_list=init_alphas, kernel_func=kernel, cost=0.4, b=0.0, tolerance=0.001) SVM.fit() predict = SVM.predict(test_samples) Reference: https://www.microsoft.com/en-us/research/wp-content/uploads/2016/02/smo-book.pdf https://www.microsoft.com/en-us/research/wp-content/uploads/2016/02/tr-98-14.pdf http://web.cs.iastate.edu/~honavar/smo-svm.pdf """ import os import sys import urllib.request import numpy as np import pandas as pd from matplotlib import pyplot as plt from sklearn.datasets import make_blobs, make_circles from sklearn.preprocessing import StandardScaler CANCER_DATASET_URL = ( "http://archive.ics.uci.edu/ml/machine-learning-databases/" "breast-cancer-wisconsin/wdbc.data" ) class SmoSVM: def __init__( self, train, kernel_func, alpha_list=None, cost=0.4, b=0.0, tolerance=0.001, auto_norm=True, ): self._init = True self._auto_norm = auto_norm self._c = np.float64(cost) self._b = np.float64(b) self._tol = np.float64(tolerance) if tolerance > 0.0001 else np.float64(0.001) self.tags = train[:, 0] self.samples = self._norm(train[:, 1:]) if self._auto_norm else train[:, 1:] self.alphas = alpha_list if alpha_list is not None else np.zeros(train.shape[0]) self.Kernel = kernel_func self._eps = 0.001 self._all_samples = list(range(self.length)) self._K_matrix = self._calculate_k_matrix() self._error = np.zeros(self.length) self._unbound = [] self.choose_alpha = self._choose_alphas() # Calculate alphas using SMO algorithm def fit(self): K = self._k state = None while True: # 1: Find alpha1, alpha2 try: i1, i2 = self.choose_alpha.send(state) state = None except StopIteration: print("Optimization done!\nEvery sample satisfy the KKT condition!") break # 2: calculate new alpha2 and new alpha1 y1, y2 = self.tags[i1], self.tags[i2] a1, a2 = self.alphas[i1].copy(), self.alphas[i2].copy() e1, e2 = self._e(i1), self._e(i2) args = (i1, i2, a1, a2, e1, e2, y1, y2) a1_new, a2_new = self._get_new_alpha(*args) if not a1_new and not a2_new: state = False continue self.alphas[i1], self.alphas[i2] = a1_new, a2_new # 3: update threshold(b) b1_new = np.float64( -e1 - y1 * K(i1, i1) * (a1_new - a1) - y2 * K(i2, i1) * (a2_new - a2) + self._b ) b2_new = np.float64( -e2 - y2 * K(i2, i2) * (a2_new - a2) - y1 * K(i1, i2) * (a1_new - a1) + self._b ) if 0.0 < a1_new < self._c: b = b1_new if 0.0 < a2_new < self._c: b = b2_new if not (np.float64(0) < a2_new < self._c) and not ( np.float64(0) < a1_new < self._c ): b = (b1_new + b2_new) / 2.0 b_old = self._b self._b = b # 4: update error value,here we only calculate those non-bound samples' # error self._unbound = [i for i in self._all_samples if self._is_unbound(i)] for s in self.unbound: if s == i1 or s == i2: continue self._error[s] += ( y1 * (a1_new - a1) * K(i1, s) + y2 * (a2_new - a2) * K(i2, s) + (self._b - b_old) ) # if i1 or i2 is non-bound,update there error value to zero if 
self._is_unbound(i1): self._error[i1] = 0 if self._is_unbound(i2): self._error[i2] = 0 # Predict test samles def predict(self, test_samples, classify=True): if test_samples.shape[1] > self.samples.shape[1]: raise ValueError( "Test samples' feature length does not equal to that of train samples" ) if self._auto_norm: test_samples = self._norm(test_samples) results = [] for test_sample in test_samples: result = self._predict(test_sample) if classify: results.append(1 if result > 0 else -1) else: results.append(result) return np.array(results) # Check if alpha violate KKT condition def _check_obey_kkt(self, index): alphas = self.alphas tol = self._tol r = self._e(index) * self.tags[index] c = self._c return (r < -tol and alphas[index] < c) or (r > tol and alphas[index] > 0.0) # Get value calculated from kernel function def _k(self, i1, i2): # for test samples,use Kernel function if isinstance(i2, np.ndarray): return self.Kernel(self.samples[i1], i2) # for train samples,Kernel values have been saved in matrix else: return self._K_matrix[i1, i2] # Get sample's error def _e(self, index): """ Two cases: 1:Sample[index] is non-bound,Fetch error from list: _error 2:sample[index] is bound,Use predicted value deduct true value: g(xi) - yi """ # get from error data if self._is_unbound(index): return self._error[index] # get by g(xi) - yi else: gx = np.dot(self.alphas * self.tags, self._K_matrix[:, index]) + self._b yi = self.tags[index] return gx - yi # Calculate Kernel matrix of all possible i1,i2 ,saving time def _calculate_k_matrix(self): k_matrix = np.zeros([self.length, self.length]) for i in self._all_samples: for j in self._all_samples: k_matrix[i, j] = np.float64( self.Kernel(self.samples[i, :], self.samples[j, :]) ) return k_matrix # Predict test sample's tag def _predict(self, sample): k = self._k predicted_value = ( np.sum( [ self.alphas[i1] * self.tags[i1] * k(i1, sample) for i1 in self._all_samples ] ) + self._b ) return predicted_value # Choose alpha1 and alpha2 def _choose_alphas(self): locis = yield from self._choose_a1() if not locis: return return locis def _choose_a1(self): """ Choose first alpha ;steps: 1:First loop over all sample 2:Second loop over all non-bound samples till all non-bound samples does not voilate kkt condition. 3:Repeat this two process endlessly,till all samples does not voilate kkt condition samples after first loop. """ while True: all_not_obey = True # all sample print("scanning all sample!") for i1 in [i for i in self._all_samples if self._check_obey_kkt(i)]: all_not_obey = False yield from self._choose_a2(i1) # non-bound sample print("scanning non-bound sample!") while True: not_obey = True for i1 in [ i for i in self._all_samples if self._check_obey_kkt(i) and self._is_unbound(i) ]: not_obey = False yield from self._choose_a2(i1) if not_obey: print("all non-bound samples fit the KKT condition!") break if all_not_obey: print("all samples fit the KKT condition! Optimization done!") break return False def _choose_a2(self, i1): """ Choose the second alpha by using heuristic algorithm ;steps: 1: Choose alpha2 which gets the maximum step size (|E1 - E2|). 2: Start in a random point,loop over all non-bound samples till alpha1 and alpha2 are optimized. 3: Start in a random point,loop over all samples till alpha1 and alpha2 are optimized. 
""" self._unbound = [i for i in self._all_samples if self._is_unbound(i)] if len(self.unbound) > 0: tmp_error = self._error.copy().tolist() tmp_error_dict = { index: value for index, value in enumerate(tmp_error) if self._is_unbound(index) } if self._e(i1) >= 0: i2 = min(tmp_error_dict, key=lambda index: tmp_error_dict[index]) else: i2 = max(tmp_error_dict, key=lambda index: tmp_error_dict[index]) cmd = yield i1, i2 if cmd is None: return for i2 in np.roll(self.unbound, np.random.choice(self.length)): cmd = yield i1, i2 if cmd is None: return for i2 in np.roll(self._all_samples, np.random.choice(self.length)): cmd = yield i1, i2 if cmd is None: return # Get the new alpha2 and new alpha1 def _get_new_alpha(self, i1, i2, a1, a2, e1, e2, y1, y2): K = self._k if i1 == i2: return None, None # calculate L and H which bound the new alpha2 s = y1 * y2 if s == -1: L, H = max(0.0, a2 - a1), min(self._c, self._c + a2 - a1) else: L, H = max(0.0, a2 + a1 - self._c), min(self._c, a2 + a1) if L == H: return None, None # calculate eta k11 = K(i1, i1) k22 = K(i2, i2) k12 = K(i1, i2) eta = k11 + k22 - 2.0 * k12 # select the new alpha2 which could get the minimal objectives if eta > 0.0: a2_new_unc = a2 + (y2 * (e1 - e2)) / eta # a2_new has a boundary if a2_new_unc >= H: a2_new = H elif a2_new_unc <= L: a2_new = L else: a2_new = a2_new_unc else: b = self._b l1 = a1 + s * (a2 - L) h1 = a1 + s * (a2 - H) # way 1 f1 = y1 * (e1 + b) - a1 * K(i1, i1) - s * a2 * K(i1, i2) f2 = y2 * (e2 + b) - a2 * K(i2, i2) - s * a1 * K(i1, i2) ol = ( l1 * f1 + L * f2 + 1 / 2 * l1 ** 2 * K(i1, i1) + 1 / 2 * L ** 2 * K(i2, i2) + s * L * l1 * K(i1, i2) ) oh = ( h1 * f1 + H * f2 + 1 / 2 * h1 ** 2 * K(i1, i1) + 1 / 2 * H ** 2 * K(i2, i2) + s * H * h1 * K(i1, i2) ) """ # way 2 Use objective function check which alpha2 new could get the minimal objectives """ if ol < (oh - self._eps): a2_new = L elif ol > oh + self._eps: a2_new = H else: a2_new = a2 # a1_new has a boundary too a1_new = a1 + s * (a2 - a2_new) if a1_new < 0: a2_new += s * a1_new a1_new = 0 if a1_new > self._c: a2_new += s * (a1_new - self._c) a1_new = self._c return a1_new, a2_new # Normalise data using min_max way def _norm(self, data): if self._init: self._min = np.min(data, axis=0) self._max = np.max(data, axis=0) self._init = False return (data - self._min) / (self._max - self._min) else: return (data - self._min) / (self._max - self._min) def _is_unbound(self, index): if 0.0 < self.alphas[index] < self._c: return True else: return False def _is_support(self, index): if self.alphas[index] > 0: return True else: return False @property def unbound(self): return self._unbound @property def support(self): return [i for i in range(self.length) if self._is_support(i)] @property def length(self): return self.samples.shape[0] class Kernel: def __init__(self, kernel, degree=1.0, coef0=0.0, gamma=1.0): self.degree = np.float64(degree) self.coef0 = np.float64(coef0) self.gamma = np.float64(gamma) self._kernel_name = kernel self._kernel = self._get_kernel(kernel_name=kernel) self._check() def _polynomial(self, v1, v2): return (self.gamma * np.inner(v1, v2) + self.coef0) ** self.degree def _linear(self, v1, v2): return np.inner(v1, v2) + self.coef0 def _rbf(self, v1, v2): return np.exp(-1 * (self.gamma * np.linalg.norm(v1 - v2) ** 2)) def _check(self): if self._kernel == self._rbf: if self.gamma < 0: raise ValueError("gamma value must greater than 0") def _get_kernel(self, kernel_name): maps = {"linear": self._linear, "poly": self._polynomial, "rbf": self._rbf} return 
maps[kernel_name] def __call__(self, v1, v2): return self._kernel(v1, v2) def __repr__(self): return self._kernel_name def count_time(func): def call_func(*args, **kwargs): import time start_time = time.time() func(*args, **kwargs) end_time = time.time() print(f"smo algorithm cost {end_time - start_time} seconds") return call_func @count_time def test_cancel_data(): print("Hello!\nStart test svm by smo algorithm!") # 0: download dataset and load into pandas' dataframe if not os.path.exists(r"cancel_data.csv"): request = urllib.request.Request( CANCER_DATASET_URL, headers={"User-Agent": "Mozilla/4.0 (compatible; MSIE 5.5; Windows NT)"}, ) response = urllib.request.urlopen(request) content = response.read().decode("utf-8") with open(r"cancel_data.csv", "w") as f: f.write(content) data = pd.read_csv(r"cancel_data.csv", header=None) # 1: pre-processing data del data[data.columns.tolist()[0]] data = data.dropna(axis=0) data = data.replace({"M": np.float64(1), "B": np.float64(-1)}) samples = np.array(data)[:, :] # 2: dividing data into train_data data and test_data data train_data, test_data = samples[:328, :], samples[328:, :] test_tags, test_samples = test_data[:, 0], test_data[:, 1:] # 3: choose kernel function,and set initial alphas to zero(optional) mykernel = Kernel(kernel="rbf", degree=5, coef0=1, gamma=0.5) al = np.zeros(train_data.shape[0]) # 4: calculating best alphas using SMO algorithm and predict test_data samples mysvm = SmoSVM( train=train_data, kernel_func=mykernel, alpha_list=al, cost=0.4, b=0.0, tolerance=0.001, ) mysvm.fit() predict = mysvm.predict(test_samples) # 5: check accuracy score = 0 test_num = test_tags.shape[0] for i in range(test_tags.shape[0]): if test_tags[i] == predict[i]: score += 1 print(f"\nall: {test_num}\nright: {score}\nfalse: {test_num - score}") print(f"Rough Accuracy: {score / test_tags.shape[0]}") def test_demonstration(): # change stdout print("\nStart plot,please wait!!!") sys.stdout = open(os.devnull, "w") ax1 = plt.subplot2grid((2, 2), (0, 0)) ax2 = plt.subplot2grid((2, 2), (0, 1)) ax3 = plt.subplot2grid((2, 2), (1, 0)) ax4 = plt.subplot2grid((2, 2), (1, 1)) ax1.set_title("linear svm,cost:0.1") test_linear_kernel(ax1, cost=0.1) ax2.set_title("linear svm,cost:500") test_linear_kernel(ax2, cost=500) ax3.set_title("rbf kernel svm,cost:0.1") test_rbf_kernel(ax3, cost=0.1) ax4.set_title("rbf kernel svm,cost:500") test_rbf_kernel(ax4, cost=500) sys.stdout = sys.__stdout__ print("Plot done!!!") def test_linear_kernel(ax, cost): train_x, train_y = make_blobs( n_samples=500, centers=2, n_features=2, random_state=1 ) train_y[train_y == 0] = -1 scaler = StandardScaler() train_x_scaled = scaler.fit_transform(train_x, train_y) train_data = np.hstack((train_y.reshape(500, 1), train_x_scaled)) mykernel = Kernel(kernel="linear", degree=5, coef0=1, gamma=0.5) mysvm = SmoSVM( train=train_data, kernel_func=mykernel, cost=cost, tolerance=0.001, auto_norm=False, ) mysvm.fit() plot_partition_boundary(mysvm, train_data, ax=ax) def test_rbf_kernel(ax, cost): train_x, train_y = make_circles( n_samples=500, noise=0.1, factor=0.1, random_state=1 ) train_y[train_y == 0] = -1 scaler = StandardScaler() train_x_scaled = scaler.fit_transform(train_x, train_y) train_data = np.hstack((train_y.reshape(500, 1), train_x_scaled)) mykernel = Kernel(kernel="rbf", degree=5, coef0=1, gamma=0.5) mysvm = SmoSVM( train=train_data, kernel_func=mykernel, cost=cost, tolerance=0.001, auto_norm=False, ) mysvm.fit() plot_partition_boundary(mysvm, train_data, ax=ax) def plot_partition_boundary( 
model, train_data, ax, resolution=100, colors=("b", "k", "r") ): """ We can not get the optimum w of our kernel svm model which is different from linear svm. For this reason, we generate randomly distributed points with high desity and prediced values of these points are calculated by using our tained model. Then we could use this prediced values to draw contour map. And this contour map can represent svm's partition boundary. """ train_data_x = train_data[:, 1] train_data_y = train_data[:, 2] train_data_tags = train_data[:, 0] xrange = np.linspace(train_data_x.min(), train_data_x.max(), resolution) yrange = np.linspace(train_data_y.min(), train_data_y.max(), resolution) test_samples = np.array([(x, y) for x in xrange for y in yrange]).reshape( resolution * resolution, 2 ) test_tags = model.predict(test_samples, classify=False) grid = test_tags.reshape((len(xrange), len(yrange))) # Plot contour map which represents the partition boundary ax.contour( xrange, yrange, np.mat(grid).T, levels=(-1, 0, 1), linestyles=("--", "-", "--"), linewidths=(1, 1, 1), colors=colors, ) # Plot all train samples ax.scatter( train_data_x, train_data_y, c=train_data_tags, cmap=plt.cm.Dark2, lw=0, alpha=0.5, ) # Plot support vectors support = model.support ax.scatter( train_data_x[support], train_data_y[support], c=train_data_tags[support], cmap=plt.cm.Dark2, ) if __name__ == "__main__": test_cancel_data() test_demonstration() plt.show()
""" Implementation of sequential minimal optimization (SMO) for support vector machines (SVM). Sequential minimal optimization (SMO) is an algorithm for solving the quadratic programming (QP) problem that arises during the training of support vector machines. It was invented by John Platt in 1998. Input: 0: type: numpy.ndarray. 1: first column of ndarray must be tags of samples, must be 1 or -1. 2: rows of ndarray represent samples. Usage: Command: python3 sequential_minimum_optimization.py Code: from sequential_minimum_optimization import SmoSVM, Kernel kernel = Kernel(kernel='poly', degree=3., coef0=1., gamma=0.5) init_alphas = np.zeros(train.shape[0]) SVM = SmoSVM(train=train, alpha_list=init_alphas, kernel_func=kernel, cost=0.4, b=0.0, tolerance=0.001) SVM.fit() predict = SVM.predict(test_samples) Reference: https://www.microsoft.com/en-us/research/wp-content/uploads/2016/02/smo-book.pdf https://www.microsoft.com/en-us/research/wp-content/uploads/2016/02/tr-98-14.pdf http://web.cs.iastate.edu/~honavar/smo-svm.pdf """ import os import sys import urllib.request import numpy as np import pandas as pd from matplotlib import pyplot as plt from sklearn.datasets import make_blobs, make_circles from sklearn.preprocessing import StandardScaler CANCER_DATASET_URL = ( "http://archive.ics.uci.edu/ml/machine-learning-databases/" "breast-cancer-wisconsin/wdbc.data" ) class SmoSVM: def __init__( self, train, kernel_func, alpha_list=None, cost=0.4, b=0.0, tolerance=0.001, auto_norm=True, ): self._init = True self._auto_norm = auto_norm self._c = np.float64(cost) self._b = np.float64(b) self._tol = np.float64(tolerance) if tolerance > 0.0001 else np.float64(0.001) self.tags = train[:, 0] self.samples = self._norm(train[:, 1:]) if self._auto_norm else train[:, 1:] self.alphas = alpha_list if alpha_list is not None else np.zeros(train.shape[0]) self.Kernel = kernel_func self._eps = 0.001 self._all_samples = list(range(self.length)) self._K_matrix = self._calculate_k_matrix() self._error = np.zeros(self.length) self._unbound = [] self.choose_alpha = self._choose_alphas() # Calculate alphas using SMO algorithm def fit(self): K = self._k state = None while True: # 1: Find alpha1, alpha2 try: i1, i2 = self.choose_alpha.send(state) state = None except StopIteration: print("Optimization done!\nEvery sample satisfy the KKT condition!") break # 2: calculate new alpha2 and new alpha1 y1, y2 = self.tags[i1], self.tags[i2] a1, a2 = self.alphas[i1].copy(), self.alphas[i2].copy() e1, e2 = self._e(i1), self._e(i2) args = (i1, i2, a1, a2, e1, e2, y1, y2) a1_new, a2_new = self._get_new_alpha(*args) if not a1_new and not a2_new: state = False continue self.alphas[i1], self.alphas[i2] = a1_new, a2_new # 3: update threshold(b) b1_new = np.float64( -e1 - y1 * K(i1, i1) * (a1_new - a1) - y2 * K(i2, i1) * (a2_new - a2) + self._b ) b2_new = np.float64( -e2 - y2 * K(i2, i2) * (a2_new - a2) - y1 * K(i1, i2) * (a1_new - a1) + self._b ) if 0.0 < a1_new < self._c: b = b1_new if 0.0 < a2_new < self._c: b = b2_new if not (np.float64(0) < a2_new < self._c) and not ( np.float64(0) < a1_new < self._c ): b = (b1_new + b2_new) / 2.0 b_old = self._b self._b = b # 4: update error value,here we only calculate those non-bound samples' # error self._unbound = [i for i in self._all_samples if self._is_unbound(i)] for s in self.unbound: if s == i1 or s == i2: continue self._error[s] += ( y1 * (a1_new - a1) * K(i1, s) + y2 * (a2_new - a2) * K(i2, s) + (self._b - b_old) ) # if i1 or i2 is non-bound,update there error value to zero if 
self._is_unbound(i1): self._error[i1] = 0 if self._is_unbound(i2): self._error[i2] = 0 # Predict test samles def predict(self, test_samples, classify=True): if test_samples.shape[1] > self.samples.shape[1]: raise ValueError( "Test samples' feature length does not equal to that of train samples" ) if self._auto_norm: test_samples = self._norm(test_samples) results = [] for test_sample in test_samples: result = self._predict(test_sample) if classify: results.append(1 if result > 0 else -1) else: results.append(result) return np.array(results) # Check if alpha violate KKT condition def _check_obey_kkt(self, index): alphas = self.alphas tol = self._tol r = self._e(index) * self.tags[index] c = self._c return (r < -tol and alphas[index] < c) or (r > tol and alphas[index] > 0.0) # Get value calculated from kernel function def _k(self, i1, i2): # for test samples,use Kernel function if isinstance(i2, np.ndarray): return self.Kernel(self.samples[i1], i2) # for train samples,Kernel values have been saved in matrix else: return self._K_matrix[i1, i2] # Get sample's error def _e(self, index): """ Two cases: 1:Sample[index] is non-bound,Fetch error from list: _error 2:sample[index] is bound,Use predicted value deduct true value: g(xi) - yi """ # get from error data if self._is_unbound(index): return self._error[index] # get by g(xi) - yi else: gx = np.dot(self.alphas * self.tags, self._K_matrix[:, index]) + self._b yi = self.tags[index] return gx - yi # Calculate Kernel matrix of all possible i1,i2 ,saving time def _calculate_k_matrix(self): k_matrix = np.zeros([self.length, self.length]) for i in self._all_samples: for j in self._all_samples: k_matrix[i, j] = np.float64( self.Kernel(self.samples[i, :], self.samples[j, :]) ) return k_matrix # Predict test sample's tag def _predict(self, sample): k = self._k predicted_value = ( np.sum( [ self.alphas[i1] * self.tags[i1] * k(i1, sample) for i1 in self._all_samples ] ) + self._b ) return predicted_value # Choose alpha1 and alpha2 def _choose_alphas(self): locis = yield from self._choose_a1() if not locis: return return locis def _choose_a1(self): """ Choose first alpha ;steps: 1:First loop over all sample 2:Second loop over all non-bound samples till all non-bound samples does not voilate kkt condition. 3:Repeat this two process endlessly,till all samples does not voilate kkt condition samples after first loop. """ while True: all_not_obey = True # all sample print("scanning all sample!") for i1 in [i for i in self._all_samples if self._check_obey_kkt(i)]: all_not_obey = False yield from self._choose_a2(i1) # non-bound sample print("scanning non-bound sample!") while True: not_obey = True for i1 in [ i for i in self._all_samples if self._check_obey_kkt(i) and self._is_unbound(i) ]: not_obey = False yield from self._choose_a2(i1) if not_obey: print("all non-bound samples fit the KKT condition!") break if all_not_obey: print("all samples fit the KKT condition! Optimization done!") break return False def _choose_a2(self, i1): """ Choose the second alpha by using heuristic algorithm ;steps: 1: Choose alpha2 which gets the maximum step size (|E1 - E2|). 2: Start in a random point,loop over all non-bound samples till alpha1 and alpha2 are optimized. 3: Start in a random point,loop over all samples till alpha1 and alpha2 are optimized. 
""" self._unbound = [i for i in self._all_samples if self._is_unbound(i)] if len(self.unbound) > 0: tmp_error = self._error.copy().tolist() tmp_error_dict = { index: value for index, value in enumerate(tmp_error) if self._is_unbound(index) } if self._e(i1) >= 0: i2 = min(tmp_error_dict, key=lambda index: tmp_error_dict[index]) else: i2 = max(tmp_error_dict, key=lambda index: tmp_error_dict[index]) cmd = yield i1, i2 if cmd is None: return for i2 in np.roll(self.unbound, np.random.choice(self.length)): cmd = yield i1, i2 if cmd is None: return for i2 in np.roll(self._all_samples, np.random.choice(self.length)): cmd = yield i1, i2 if cmd is None: return # Get the new alpha2 and new alpha1 def _get_new_alpha(self, i1, i2, a1, a2, e1, e2, y1, y2): K = self._k if i1 == i2: return None, None # calculate L and H which bound the new alpha2 s = y1 * y2 if s == -1: L, H = max(0.0, a2 - a1), min(self._c, self._c + a2 - a1) else: L, H = max(0.0, a2 + a1 - self._c), min(self._c, a2 + a1) if L == H: return None, None # calculate eta k11 = K(i1, i1) k22 = K(i2, i2) k12 = K(i1, i2) eta = k11 + k22 - 2.0 * k12 # select the new alpha2 which could get the minimal objectives if eta > 0.0: a2_new_unc = a2 + (y2 * (e1 - e2)) / eta # a2_new has a boundary if a2_new_unc >= H: a2_new = H elif a2_new_unc <= L: a2_new = L else: a2_new = a2_new_unc else: b = self._b l1 = a1 + s * (a2 - L) h1 = a1 + s * (a2 - H) # way 1 f1 = y1 * (e1 + b) - a1 * K(i1, i1) - s * a2 * K(i1, i2) f2 = y2 * (e2 + b) - a2 * K(i2, i2) - s * a1 * K(i1, i2) ol = ( l1 * f1 + L * f2 + 1 / 2 * l1 ** 2 * K(i1, i1) + 1 / 2 * L ** 2 * K(i2, i2) + s * L * l1 * K(i1, i2) ) oh = ( h1 * f1 + H * f2 + 1 / 2 * h1 ** 2 * K(i1, i1) + 1 / 2 * H ** 2 * K(i2, i2) + s * H * h1 * K(i1, i2) ) """ # way 2 Use objective function check which alpha2 new could get the minimal objectives """ if ol < (oh - self._eps): a2_new = L elif ol > oh + self._eps: a2_new = H else: a2_new = a2 # a1_new has a boundary too a1_new = a1 + s * (a2 - a2_new) if a1_new < 0: a2_new += s * a1_new a1_new = 0 if a1_new > self._c: a2_new += s * (a1_new - self._c) a1_new = self._c return a1_new, a2_new # Normalise data using min_max way def _norm(self, data): if self._init: self._min = np.min(data, axis=0) self._max = np.max(data, axis=0) self._init = False return (data - self._min) / (self._max - self._min) else: return (data - self._min) / (self._max - self._min) def _is_unbound(self, index): if 0.0 < self.alphas[index] < self._c: return True else: return False def _is_support(self, index): if self.alphas[index] > 0: return True else: return False @property def unbound(self): return self._unbound @property def support(self): return [i for i in range(self.length) if self._is_support(i)] @property def length(self): return self.samples.shape[0] class Kernel: def __init__(self, kernel, degree=1.0, coef0=0.0, gamma=1.0): self.degree = np.float64(degree) self.coef0 = np.float64(coef0) self.gamma = np.float64(gamma) self._kernel_name = kernel self._kernel = self._get_kernel(kernel_name=kernel) self._check() def _polynomial(self, v1, v2): return (self.gamma * np.inner(v1, v2) + self.coef0) ** self.degree def _linear(self, v1, v2): return np.inner(v1, v2) + self.coef0 def _rbf(self, v1, v2): return np.exp(-1 * (self.gamma * np.linalg.norm(v1 - v2) ** 2)) def _check(self): if self._kernel == self._rbf: if self.gamma < 0: raise ValueError("gamma value must greater than 0") def _get_kernel(self, kernel_name): maps = {"linear": self._linear, "poly": self._polynomial, "rbf": self._rbf} return 
maps[kernel_name] def __call__(self, v1, v2): return self._kernel(v1, v2) def __repr__(self): return self._kernel_name def count_time(func): def call_func(*args, **kwargs): import time start_time = time.time() func(*args, **kwargs) end_time = time.time() print(f"smo algorithm cost {end_time - start_time} seconds") return call_func @count_time def test_cancel_data(): print("Hello!\nStart test svm by smo algorithm!") # 0: download dataset and load into pandas' dataframe if not os.path.exists(r"cancel_data.csv"): request = urllib.request.Request( CANCER_DATASET_URL, headers={"User-Agent": "Mozilla/4.0 (compatible; MSIE 5.5; Windows NT)"}, ) response = urllib.request.urlopen(request) content = response.read().decode("utf-8") with open(r"cancel_data.csv", "w") as f: f.write(content) data = pd.read_csv(r"cancel_data.csv", header=None) # 1: pre-processing data del data[data.columns.tolist()[0]] data = data.dropna(axis=0) data = data.replace({"M": np.float64(1), "B": np.float64(-1)}) samples = np.array(data)[:, :] # 2: dividing data into train_data data and test_data data train_data, test_data = samples[:328, :], samples[328:, :] test_tags, test_samples = test_data[:, 0], test_data[:, 1:] # 3: choose kernel function,and set initial alphas to zero(optional) mykernel = Kernel(kernel="rbf", degree=5, coef0=1, gamma=0.5) al = np.zeros(train_data.shape[0]) # 4: calculating best alphas using SMO algorithm and predict test_data samples mysvm = SmoSVM( train=train_data, kernel_func=mykernel, alpha_list=al, cost=0.4, b=0.0, tolerance=0.001, ) mysvm.fit() predict = mysvm.predict(test_samples) # 5: check accuracy score = 0 test_num = test_tags.shape[0] for i in range(test_tags.shape[0]): if test_tags[i] == predict[i]: score += 1 print(f"\nall: {test_num}\nright: {score}\nfalse: {test_num - score}") print(f"Rough Accuracy: {score / test_tags.shape[0]}") def test_demonstration(): # change stdout print("\nStart plot,please wait!!!") sys.stdout = open(os.devnull, "w") ax1 = plt.subplot2grid((2, 2), (0, 0)) ax2 = plt.subplot2grid((2, 2), (0, 1)) ax3 = plt.subplot2grid((2, 2), (1, 0)) ax4 = plt.subplot2grid((2, 2), (1, 1)) ax1.set_title("linear svm,cost:0.1") test_linear_kernel(ax1, cost=0.1) ax2.set_title("linear svm,cost:500") test_linear_kernel(ax2, cost=500) ax3.set_title("rbf kernel svm,cost:0.1") test_rbf_kernel(ax3, cost=0.1) ax4.set_title("rbf kernel svm,cost:500") test_rbf_kernel(ax4, cost=500) sys.stdout = sys.__stdout__ print("Plot done!!!") def test_linear_kernel(ax, cost): train_x, train_y = make_blobs( n_samples=500, centers=2, n_features=2, random_state=1 ) train_y[train_y == 0] = -1 scaler = StandardScaler() train_x_scaled = scaler.fit_transform(train_x, train_y) train_data = np.hstack((train_y.reshape(500, 1), train_x_scaled)) mykernel = Kernel(kernel="linear", degree=5, coef0=1, gamma=0.5) mysvm = SmoSVM( train=train_data, kernel_func=mykernel, cost=cost, tolerance=0.001, auto_norm=False, ) mysvm.fit() plot_partition_boundary(mysvm, train_data, ax=ax) def test_rbf_kernel(ax, cost): train_x, train_y = make_circles( n_samples=500, noise=0.1, factor=0.1, random_state=1 ) train_y[train_y == 0] = -1 scaler = StandardScaler() train_x_scaled = scaler.fit_transform(train_x, train_y) train_data = np.hstack((train_y.reshape(500, 1), train_x_scaled)) mykernel = Kernel(kernel="rbf", degree=5, coef0=1, gamma=0.5) mysvm = SmoSVM( train=train_data, kernel_func=mykernel, cost=cost, tolerance=0.001, auto_norm=False, ) mysvm.fit() plot_partition_boundary(mysvm, train_data, ax=ax) def plot_partition_boundary( 
model, train_data, ax, resolution=100, colors=("b", "k", "r") ): """ We can not get the optimum w of our kernel svm model which is different from linear svm. For this reason, we generate randomly distributed points with high desity and prediced values of these points are calculated by using our tained model. Then we could use this prediced values to draw contour map. And this contour map can represent svm's partition boundary. """ train_data_x = train_data[:, 1] train_data_y = train_data[:, 2] train_data_tags = train_data[:, 0] xrange = np.linspace(train_data_x.min(), train_data_x.max(), resolution) yrange = np.linspace(train_data_y.min(), train_data_y.max(), resolution) test_samples = np.array([(x, y) for x in xrange for y in yrange]).reshape( resolution * resolution, 2 ) test_tags = model.predict(test_samples, classify=False) grid = test_tags.reshape((len(xrange), len(yrange))) # Plot contour map which represents the partition boundary ax.contour( xrange, yrange, np.mat(grid).T, levels=(-1, 0, 1), linestyles=("--", "-", "--"), linewidths=(1, 1, 1), colors=colors, ) # Plot all train samples ax.scatter( train_data_x, train_data_y, c=train_data_tags, cmap=plt.cm.Dark2, lw=0, alpha=0.5, ) # Plot support vectors support = model.support ax.scatter( train_data_x[support], train_data_y[support], c=train_data_tags[support], cmap=plt.cm.Dark2, ) if __name__ == "__main__": test_cancel_data() test_demonstration() plt.show()
-1
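The SMO record above explains, in its comments and docstrings, how each optimization step picks an alpha pair, bounds the new alpha2 to an interval [L, H], and derives alpha1 from it. As a reading aid only, here is a minimal, self-contained sketch of that clipping step for the common eta > 0 case. It is not part of the dataset record: the function name `clip_alpha_pair` and its argument names are illustrative assumptions, and the degenerate eta <= 0 branch (which the full `_get_new_alpha` method resolves by comparing objective values at L and H) is deliberately left out.

```python
# Hypothetical sketch of one SMO alpha-pair update (eta > 0 case only).
# Names are illustrative and do not appear in the dataset record above.
import numpy as np


def clip_alpha_pair(a1, a2, e1, e2, y1, y2, k11, k22, k12, c):
    """Return clipped (a1_new, a2_new) for one SMO step, or None if no progress."""
    s = y1 * y2
    # Feasible interval [L, H] for the new alpha2 under the box constraint 0 <= alpha <= C
    if s == -1:
        low, high = max(0.0, a2 - a1), min(c, c + a2 - a1)
    else:
        low, high = max(0.0, a2 + a1 - c), min(c, a2 + a1)
    if low == high:
        return None
    # eta is the curvature of the objective along the constraint line
    eta = k11 + k22 - 2.0 * k12
    if eta <= 0:
        return None  # the full implementation handles this by evaluating the objective at L and H
    a2_new = float(np.clip(a2 + y2 * (e1 - e2) / eta, low, high))
    a1_new = a1 + s * (a2 - a2_new)  # keep the linear equality constraint satisfied
    return a1_new, a2_new


# Usage sketch: opposite labels, unit kernel diagonal, small cross term
print(clip_alpha_pair(0.0, 0.0, -1.0, 1.0, 1, -1, 1.0, 1.0, 0.2, 1.0))
```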
TheAlgorithms/Python
5,570
[mypy] annotate `compression`
### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
ErwinJunge
"2021-10-23T19:54:11Z"
"2021-10-26T10:29:28Z"
de07245c170f2007cef415fa4114be870078988e
e49d8e3af427353c18fe5f1afb0927e3e8d6461c
[mypy] annotate `compression`. ### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
from sklearn import svm from sklearn.datasets import load_iris from sklearn.model_selection import train_test_split # different functions implementing different types of SVM's def NuSVC(train_x, train_y): svc_NuSVC = svm.NuSVC() svc_NuSVC.fit(train_x, train_y) return svc_NuSVC def Linearsvc(train_x, train_y): svc_linear = svm.LinearSVC(tol=10e-2) svc_linear.fit(train_x, train_y) return svc_linear def SVC(train_x, train_y): # svm.SVC(C=1.0, kernel='rbf', degree=3, gamma=0.0, coef0=0.0, shrinking=True, # probability=False,tol=0.001, cache_size=200, class_weight=None, verbose=False, # max_iter=1000, random_state=None) # various parameters like "kernel","gamma","C" can effectively tuned for a given # machine learning model. SVC = svm.SVC(gamma="auto") SVC.fit(train_x, train_y) return SVC def test(X_new): """ 3 test cases to be passed an array containing the sepal length (cm), sepal width (cm), petal length (cm), petal width (cm) based on which the target name will be predicted >>> test([1,2,1,4]) 'virginica' >>> test([5, 2, 4, 1]) 'versicolor' >>> test([6,3,4,1]) 'versicolor' """ iris = load_iris() # splitting the dataset to test and train train_x, test_x, train_y, test_y = train_test_split( iris["data"], iris["target"], random_state=4 ) # any of the 3 types of SVM can be used # current_model=SVC(train_x, train_y) # current_model=NuSVC(train_x, train_y) current_model = Linearsvc(train_x, train_y) prediction = current_model.predict([X_new]) return iris["target_names"][prediction][0] if __name__ == "__main__": import doctest doctest.testmod()
from sklearn import svm from sklearn.datasets import load_iris from sklearn.model_selection import train_test_split # different functions implementing different types of SVM's def NuSVC(train_x, train_y): svc_NuSVC = svm.NuSVC() svc_NuSVC.fit(train_x, train_y) return svc_NuSVC def Linearsvc(train_x, train_y): svc_linear = svm.LinearSVC(tol=10e-2) svc_linear.fit(train_x, train_y) return svc_linear def SVC(train_x, train_y): # svm.SVC(C=1.0, kernel='rbf', degree=3, gamma=0.0, coef0=0.0, shrinking=True, # probability=False,tol=0.001, cache_size=200, class_weight=None, verbose=False, # max_iter=1000, random_state=None) # various parameters like "kernel","gamma","C" can effectively tuned for a given # machine learning model. SVC = svm.SVC(gamma="auto") SVC.fit(train_x, train_y) return SVC def test(X_new): """ 3 test cases to be passed an array containing the sepal length (cm), sepal width (cm), petal length (cm), petal width (cm) based on which the target name will be predicted >>> test([1,2,1,4]) 'virginica' >>> test([5, 2, 4, 1]) 'versicolor' >>> test([6,3,4,1]) 'versicolor' """ iris = load_iris() # splitting the dataset to test and train train_x, test_x, train_y, test_y = train_test_split( iris["data"], iris["target"], random_state=4 ) # any of the 3 types of SVM can be used # current_model=SVC(train_x, train_y) # current_model=NuSVC(train_x, train_y) current_model = Linearsvc(train_x, train_y) prediction = current_model.predict([X_new]) return iris["target_names"][prediction][0] if __name__ == "__main__": import doctest doctest.testmod()
-1
TheAlgorithms/Python
5,570
[mypy] annotate `compression`
### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
ErwinJunge
"2021-10-23T19:54:11Z"
"2021-10-26T10:29:28Z"
de07245c170f2007cef415fa4114be870078988e
e49d8e3af427353c18fe5f1afb0927e3e8d6461c
[mypy] annotate `compression`. ### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
name: "build" on: pull_request: schedule: - cron: "0 0 * * *" # Run everyday jobs: build: runs-on: ubuntu-latest steps: - uses: actions/checkout@v2 - uses: actions/setup-python@v2 with: python-version: "3.9" - uses: actions/cache@v2 with: path: ~/.cache/pip key: ${{ runner.os }}-pip-${{ hashFiles('requirements.txt') }} - name: Install dependencies run: | python -m pip install --upgrade pip setuptools six wheel python -m pip install mypy pytest-cov -r requirements.txt - run: mypy --install-types --non-interactive . - name: Run tests run: pytest --doctest-modules --ignore=project_euler/ --ignore=scripts/validate_solutions.py --cov-report=term-missing:skip-covered --cov=. . - if: ${{ success() }} run: scripts/build_directory_md.py 2>&1 | tee DIRECTORY.md
name: "build" on: pull_request: schedule: - cron: "0 0 * * *" # Run everyday jobs: build: runs-on: ubuntu-latest steps: - uses: actions/checkout@v2 - uses: actions/setup-python@v2 with: python-version: "3.9" - uses: actions/cache@v2 with: path: ~/.cache/pip key: ${{ runner.os }}-pip-${{ hashFiles('requirements.txt') }} - name: Install dependencies run: | python -m pip install --upgrade pip setuptools six wheel python -m pip install mypy pytest-cov -r requirements.txt - run: mypy --install-types --non-interactive . - name: Run tests run: pytest --doctest-modules --ignore=project_euler/ --ignore=scripts/validate_solutions.py --cov-report=term-missing:skip-covered --cov=. . - if: ${{ success() }} run: scripts/build_directory_md.py 2>&1 | tee DIRECTORY.md
-1
TheAlgorithms/Python
5,570
[mypy] annotate `compression`
### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
ErwinJunge
"2021-10-23T19:54:11Z"
"2021-10-26T10:29:28Z"
de07245c170f2007cef415fa4114be870078988e
e49d8e3af427353c18fe5f1afb0927e3e8d6461c
[mypy] annotate `compression`. ### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
class things: def __init__(self, name, value, weight): self.name = name self.value = value self.weight = weight def __repr__(self): return f"{self.__class__.__name__}({self.name}, {self.value}, {self.weight})" def get_value(self): return self.value def get_name(self): return self.name def get_weight(self): return self.weight def value_Weight(self): return self.value / self.weight def build_menu(name, value, weight): menu = [] for i in range(len(value)): menu.append(things(name[i], value[i], weight[i])) return menu def greedy(item, maxCost, keyFunc): itemsCopy = sorted(item, key=keyFunc, reverse=True) result = [] totalValue, total_cost = 0.0, 0.0 for i in range(len(itemsCopy)): if (total_cost + itemsCopy[i].get_weight()) <= maxCost: result.append(itemsCopy[i]) total_cost += itemsCopy[i].get_weight() totalValue += itemsCopy[i].get_value() return (result, totalValue) def test_greedy(): """ >>> food = ["Burger", "Pizza", "Coca Cola", "Rice", ... "Sambhar", "Chicken", "Fries", "Milk"] >>> value = [80, 100, 60, 70, 50, 110, 90, 60] >>> weight = [40, 60, 40, 70, 100, 85, 55, 70] >>> foods = build_menu(food, value, weight) >>> foods # doctest: +NORMALIZE_WHITESPACE [things(Burger, 80, 40), things(Pizza, 100, 60), things(Coca Cola, 60, 40), things(Rice, 70, 70), things(Sambhar, 50, 100), things(Chicken, 110, 85), things(Fries, 90, 55), things(Milk, 60, 70)] >>> greedy(foods, 500, things.get_value) # doctest: +NORMALIZE_WHITESPACE ([things(Chicken, 110, 85), things(Pizza, 100, 60), things(Fries, 90, 55), things(Burger, 80, 40), things(Rice, 70, 70), things(Coca Cola, 60, 40), things(Milk, 60, 70)], 570.0) """ if __name__ == "__main__": import doctest doctest.testmod()
class things: def __init__(self, name, value, weight): self.name = name self.value = value self.weight = weight def __repr__(self): return f"{self.__class__.__name__}({self.name}, {self.value}, {self.weight})" def get_value(self): return self.value def get_name(self): return self.name def get_weight(self): return self.weight def value_Weight(self): return self.value / self.weight def build_menu(name, value, weight): menu = [] for i in range(len(value)): menu.append(things(name[i], value[i], weight[i])) return menu def greedy(item, maxCost, keyFunc): itemsCopy = sorted(item, key=keyFunc, reverse=True) result = [] totalValue, total_cost = 0.0, 0.0 for i in range(len(itemsCopy)): if (total_cost + itemsCopy[i].get_weight()) <= maxCost: result.append(itemsCopy[i]) total_cost += itemsCopy[i].get_weight() totalValue += itemsCopy[i].get_value() return (result, totalValue) def test_greedy(): """ >>> food = ["Burger", "Pizza", "Coca Cola", "Rice", ... "Sambhar", "Chicken", "Fries", "Milk"] >>> value = [80, 100, 60, 70, 50, 110, 90, 60] >>> weight = [40, 60, 40, 70, 100, 85, 55, 70] >>> foods = build_menu(food, value, weight) >>> foods # doctest: +NORMALIZE_WHITESPACE [things(Burger, 80, 40), things(Pizza, 100, 60), things(Coca Cola, 60, 40), things(Rice, 70, 70), things(Sambhar, 50, 100), things(Chicken, 110, 85), things(Fries, 90, 55), things(Milk, 60, 70)] >>> greedy(foods, 500, things.get_value) # doctest: +NORMALIZE_WHITESPACE ([things(Chicken, 110, 85), things(Pizza, 100, 60), things(Fries, 90, 55), things(Burger, 80, 40), things(Rice, 70, 70), things(Coca Cola, 60, 40), things(Milk, 60, 70)], 570.0) """ if __name__ == "__main__": import doctest doctest.testmod()
-1
TheAlgorithms/Python
5,570
[mypy] annotate `compression`
### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
ErwinJunge
"2021-10-23T19:54:11Z"
"2021-10-26T10:29:28Z"
de07245c170f2007cef415fa4114be870078988e
e49d8e3af427353c18fe5f1afb0927e3e8d6461c
[mypy] annotate `compression`. ### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
hex_table = {hex(i)[2:]: i for i in range(16)} # Use [:2] to strip off the leading '0x' def hex_to_decimal(hex_string: str) -> int: """ Convert a hexadecimal value to its decimal equivalent #https://www.programiz.com/python-programming/methods/built-in/hex >>> hex_to_decimal("a") 10 >>> hex_to_decimal("12f") 303 >>> hex_to_decimal(" 12f ") 303 >>> hex_to_decimal("FfFf") 65535 >>> hex_to_decimal("-Ff") -255 >>> hex_to_decimal("F-f") Traceback (most recent call last): ... ValueError: Non-hexadecimal value was passed to the function >>> hex_to_decimal("") Traceback (most recent call last): ... ValueError: Empty string was passed to the function >>> hex_to_decimal("12m") Traceback (most recent call last): ... ValueError: Non-hexadecimal value was passed to the function """ hex_string = hex_string.strip().lower() if not hex_string: raise ValueError("Empty string was passed to the function") is_negative = hex_string[0] == "-" if is_negative: hex_string = hex_string[1:] if not all(char in hex_table for char in hex_string): raise ValueError("Non-hexadecimal value was passed to the function") decimal_number = 0 for char in hex_string: decimal_number = 16 * decimal_number + hex_table[char] return -decimal_number if is_negative else decimal_number if __name__ == "__main__": from doctest import testmod testmod()
hex_table = {hex(i)[2:]: i for i in range(16)} # Use [:2] to strip off the leading '0x' def hex_to_decimal(hex_string: str) -> int: """ Convert a hexadecimal value to its decimal equivalent #https://www.programiz.com/python-programming/methods/built-in/hex >>> hex_to_decimal("a") 10 >>> hex_to_decimal("12f") 303 >>> hex_to_decimal(" 12f ") 303 >>> hex_to_decimal("FfFf") 65535 >>> hex_to_decimal("-Ff") -255 >>> hex_to_decimal("F-f") Traceback (most recent call last): ... ValueError: Non-hexadecimal value was passed to the function >>> hex_to_decimal("") Traceback (most recent call last): ... ValueError: Empty string was passed to the function >>> hex_to_decimal("12m") Traceback (most recent call last): ... ValueError: Non-hexadecimal value was passed to the function """ hex_string = hex_string.strip().lower() if not hex_string: raise ValueError("Empty string was passed to the function") is_negative = hex_string[0] == "-" if is_negative: hex_string = hex_string[1:] if not all(char in hex_table for char in hex_string): raise ValueError("Non-hexadecimal value was passed to the function") decimal_number = 0 for char in hex_string: decimal_number = 16 * decimal_number + hex_table[char] return -decimal_number if is_negative else decimal_number if __name__ == "__main__": from doctest import testmod testmod()
-1
TheAlgorithms/Python
5,570
[mypy] annotate `compression`
### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
ErwinJunge
"2021-10-23T19:54:11Z"
"2021-10-26T10:29:28Z"
de07245c170f2007cef415fa4114be870078988e
e49d8e3af427353c18fe5f1afb0927e3e8d6461c
[mypy] annotate `compression`. ### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
-1
TheAlgorithms/Python
5,570
[mypy] annotate `compression`
### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
ErwinJunge
"2021-10-23T19:54:11Z"
"2021-10-26T10:29:28Z"
de07245c170f2007cef415fa4114be870078988e
e49d8e3af427353c18fe5f1afb0927e3e8d6461c
[mypy] annotate `compression`. ### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" PyTest's for Digital Image Processing """ from cv2 import COLOR_BGR2GRAY, cvtColor, imread from numpy import array, uint8 from PIL import Image from digital_image_processing import change_contrast as cc from digital_image_processing import convert_to_negative as cn from digital_image_processing import sepia as sp from digital_image_processing.dithering import burkes as bs from digital_image_processing.edge_detection import canny as canny from digital_image_processing.filters import convolve as conv from digital_image_processing.filters import gaussian_filter as gg from digital_image_processing.filters import median_filter as med from digital_image_processing.filters import sobel_filter as sob from digital_image_processing.resize import resize as rs img = imread(r"digital_image_processing/image_data/lena_small.jpg") gray = cvtColor(img, COLOR_BGR2GRAY) # Test: convert_to_negative() def test_convert_to_negative(): negative_img = cn.convert_to_negative(img) # assert negative_img array for at least one True assert negative_img.any() # Test: change_contrast() def test_change_contrast(): with Image.open("digital_image_processing/image_data/lena_small.jpg") as img: # Work around assertion for response assert str(cc.change_contrast(img, 110)).startswith( "<PIL.Image.Image image mode=RGB size=100x100 at" ) # canny.gen_gaussian_kernel() def test_gen_gaussian_kernel(): resp = canny.gen_gaussian_kernel(9, sigma=1.4) # Assert ambiguous array assert resp.all() # canny.py def test_canny(): canny_img = imread("digital_image_processing/image_data/lena_small.jpg", 0) # assert ambiguous array for all == True assert canny_img.all() canny_array = canny.canny(canny_img) # assert canny array for at least one True assert canny_array.any() # filters/gaussian_filter.py def test_gen_gaussian_kernel_filter(): assert gg.gaussian_filter(gray, 5, sigma=0.9).all() def test_convolve_filter(): # laplace diagonals Laplace = array([[0.25, 0.5, 0.25], [0.5, -3, 0.5], [0.25, 0.5, 0.25]]) res = conv.img_convolve(gray, Laplace).astype(uint8) assert res.any() def test_median_filter(): assert med.median_filter(gray, 3).any() def test_sobel_filter(): grad, theta = sob.sobel_filter(gray) assert grad.any() and theta.any() def test_sepia(): sepia = sp.make_sepia(img, 20) assert sepia.all() def test_burkes(file_path: str = "digital_image_processing/image_data/lena_small.jpg"): burkes = bs.Burkes(imread(file_path, 1), 120) burkes.process() assert burkes.output_img.any() def test_nearest_neighbour( file_path: str = "digital_image_processing/image_data/lena_small.jpg", ): nn = rs.NearestNeighbour(imread(file_path, 1), 400, 200) nn.process() assert nn.output.any()
""" PyTest's for Digital Image Processing """ from cv2 import COLOR_BGR2GRAY, cvtColor, imread from numpy import array, uint8 from PIL import Image from digital_image_processing import change_contrast as cc from digital_image_processing import convert_to_negative as cn from digital_image_processing import sepia as sp from digital_image_processing.dithering import burkes as bs from digital_image_processing.edge_detection import canny as canny from digital_image_processing.filters import convolve as conv from digital_image_processing.filters import gaussian_filter as gg from digital_image_processing.filters import median_filter as med from digital_image_processing.filters import sobel_filter as sob from digital_image_processing.resize import resize as rs img = imread(r"digital_image_processing/image_data/lena_small.jpg") gray = cvtColor(img, COLOR_BGR2GRAY) # Test: convert_to_negative() def test_convert_to_negative(): negative_img = cn.convert_to_negative(img) # assert negative_img array for at least one True assert negative_img.any() # Test: change_contrast() def test_change_contrast(): with Image.open("digital_image_processing/image_data/lena_small.jpg") as img: # Work around assertion for response assert str(cc.change_contrast(img, 110)).startswith( "<PIL.Image.Image image mode=RGB size=100x100 at" ) # canny.gen_gaussian_kernel() def test_gen_gaussian_kernel(): resp = canny.gen_gaussian_kernel(9, sigma=1.4) # Assert ambiguous array assert resp.all() # canny.py def test_canny(): canny_img = imread("digital_image_processing/image_data/lena_small.jpg", 0) # assert ambiguous array for all == True assert canny_img.all() canny_array = canny.canny(canny_img) # assert canny array for at least one True assert canny_array.any() # filters/gaussian_filter.py def test_gen_gaussian_kernel_filter(): assert gg.gaussian_filter(gray, 5, sigma=0.9).all() def test_convolve_filter(): # laplace diagonals Laplace = array([[0.25, 0.5, 0.25], [0.5, -3, 0.5], [0.25, 0.5, 0.25]]) res = conv.img_convolve(gray, Laplace).astype(uint8) assert res.any() def test_median_filter(): assert med.median_filter(gray, 3).any() def test_sobel_filter(): grad, theta = sob.sobel_filter(gray) assert grad.any() and theta.any() def test_sepia(): sepia = sp.make_sepia(img, 20) assert sepia.all() def test_burkes(file_path: str = "digital_image_processing/image_data/lena_small.jpg"): burkes = bs.Burkes(imread(file_path, 1), 120) burkes.process() assert burkes.output_img.any() def test_nearest_neighbour( file_path: str = "digital_image_processing/image_data/lena_small.jpg", ): nn = rs.NearestNeighbour(imread(file_path, 1), 400, 200) nn.process() assert nn.output.any()
-1
TheAlgorithms/Python
5,570
[mypy] annotate `compression`
### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
ErwinJunge
"2021-10-23T19:54:11Z"
"2021-10-26T10:29:28Z"
de07245c170f2007cef415fa4114be870078988e
e49d8e3af427353c18fe5f1afb0927e3e8d6461c
[mypy] annotate `compression`. ### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
from collections import Counter import numpy as np from sklearn import datasets from sklearn.model_selection import train_test_split data = datasets.load_iris() X = np.array(data["data"]) y = np.array(data["target"]) classes = data["target_names"] X_train, X_test, y_train, y_test = train_test_split(X, y) def euclidean_distance(a, b): """ Gives the euclidean distance between two points >>> euclidean_distance([0, 0], [3, 4]) 5.0 >>> euclidean_distance([1, 2, 3], [1, 8, 11]) 10.0 """ return np.linalg.norm(np.array(a) - np.array(b)) def classifier(train_data, train_target, classes, point, k=5): """ Classifies the point using the KNN algorithm k closest points are found (ranked in ascending order of euclidean distance) Params: :train_data: Set of points that are classified into two or more classes :train_target: List of classes in the order of train_data points :classes: Labels of the classes :point: The data point that needs to be classified >>> X_train = [[0, 0], [1, 0], [0, 1], [0.5, 0.5], [3, 3], [2, 3], [3, 2]] >>> y_train = [0, 0, 0, 0, 1, 1, 1] >>> classes = ['A','B']; point = [1.2,1.2] >>> classifier(X_train, y_train, classes,point) 'A' """ data = zip(train_data, train_target) # List of distances of all points from the point to be classified distances = [] for data_point in data: distance = euclidean_distance(data_point[0], point) distances.append((distance, data_point[1])) # Choosing 'k' points with the least distances. votes = [i[1] for i in sorted(distances)[:k]] # Most commonly occurring class among them # is the class into which the point is classified result = Counter(votes).most_common(1)[0][0] return classes[result] if __name__ == "__main__": print(classifier(X_train, y_train, classes, [4.4, 3.1, 1.3, 1.4]))
from collections import Counter import numpy as np from sklearn import datasets from sklearn.model_selection import train_test_split data = datasets.load_iris() X = np.array(data["data"]) y = np.array(data["target"]) classes = data["target_names"] X_train, X_test, y_train, y_test = train_test_split(X, y) def euclidean_distance(a, b): """ Gives the euclidean distance between two points >>> euclidean_distance([0, 0], [3, 4]) 5.0 >>> euclidean_distance([1, 2, 3], [1, 8, 11]) 10.0 """ return np.linalg.norm(np.array(a) - np.array(b)) def classifier(train_data, train_target, classes, point, k=5): """ Classifies the point using the KNN algorithm k closest points are found (ranked in ascending order of euclidean distance) Params: :train_data: Set of points that are classified into two or more classes :train_target: List of classes in the order of train_data points :classes: Labels of the classes :point: The data point that needs to be classified >>> X_train = [[0, 0], [1, 0], [0, 1], [0.5, 0.5], [3, 3], [2, 3], [3, 2]] >>> y_train = [0, 0, 0, 0, 1, 1, 1] >>> classes = ['A','B']; point = [1.2,1.2] >>> classifier(X_train, y_train, classes,point) 'A' """ data = zip(train_data, train_target) # List of distances of all points from the point to be classified distances = [] for data_point in data: distance = euclidean_distance(data_point[0], point) distances.append((distance, data_point[1])) # Choosing 'k' points with the least distances. votes = [i[1] for i in sorted(distances)[:k]] # Most commonly occurring class among them # is the class into which the point is classified result = Counter(votes).most_common(1)[0][0] return classes[result] if __name__ == "__main__": print(classifier(X_train, y_train, classes, [4.4, 3.1, 1.3, 1.4]))
-1
TheAlgorithms/Python
5,570
[mypy] annotate `compression`
### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
ErwinJunge
"2021-10-23T19:54:11Z"
"2021-10-26T10:29:28Z"
de07245c170f2007cef415fa4114be870078988e
e49d8e3af427353c18fe5f1afb0927e3e8d6461c
[mypy] annotate `compression`. ### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Numerical integration or quadrature for a smooth function f with known values at x_i This method is the classical approach of suming 'Equally Spaced Abscissas' method 1: "extended trapezoidal rule" """ def method_1(boundary, steps): # "extended trapezoidal rule" # int(f) = dx/2 * (f1 + 2f2 + ... + fn) h = (boundary[1] - boundary[0]) / steps a = boundary[0] b = boundary[1] x_i = make_points(a, b, h) y = 0.0 y += (h / 2.0) * f(a) for i in x_i: # print(i) y += h * f(i) y += (h / 2.0) * f(b) return y def make_points(a, b, h): x = a + h while x < (b - h): yield x x = x + h def f(x): # enter your function here y = (x - 0) * (x - 0) return y def main(): a = 0.0 # Lower bound of integration b = 1.0 # Upper bound of integration steps = 10.0 # define number of steps or resolution boundary = [a, b] # define boundary of integration y = method_1(boundary, steps) print(f"y = {y}") if __name__ == "__main__": main()
""" Numerical integration or quadrature for a smooth function f with known values at x_i This method is the classical approach of suming 'Equally Spaced Abscissas' method 1: "extended trapezoidal rule" """ def method_1(boundary, steps): # "extended trapezoidal rule" # int(f) = dx/2 * (f1 + 2f2 + ... + fn) h = (boundary[1] - boundary[0]) / steps a = boundary[0] b = boundary[1] x_i = make_points(a, b, h) y = 0.0 y += (h / 2.0) * f(a) for i in x_i: # print(i) y += h * f(i) y += (h / 2.0) * f(b) return y def make_points(a, b, h): x = a + h while x < (b - h): yield x x = x + h def f(x): # enter your function here y = (x - 0) * (x - 0) return y def main(): a = 0.0 # Lower bound of integration b = 1.0 # Upper bound of integration steps = 10.0 # define number of steps or resolution boundary = [a, b] # define boundary of integration y = method_1(boundary, steps) print(f"y = {y}") if __name__ == "__main__": main()
-1
TheAlgorithms/Python
5,570
[mypy] annotate `compression`
### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
ErwinJunge
"2021-10-23T19:54:11Z"
"2021-10-26T10:29:28Z"
de07245c170f2007cef415fa4114be870078988e
e49d8e3af427353c18fe5f1afb0927e3e8d6461c
[mypy] annotate `compression`. ### **Describe your change:** Fixed missing type annotation for [compression](https://github.com/TheAlgorithms/Python/blob/master/compression) directory. It now passes `mypy --strict`. Related to #4052 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
-1
TheAlgorithms/Python
5,566
[mypy] Fix type annotations for stack.py
``` $ git checkout mypy-fix-stacks-stack Switched to branch 'mypy-fix-stacks-stack' $ mypy --ignore-missing-imports data_structures/stacks/stack.py --strict Success: no issues found in 1 source file $ mypy --ignore-missing-imports data_structures/stacks --strict > after.txt $ git checkout master Switched to branch 'master' $ mypy --ignore-missing-imports data_structures/stacks --strict > before.txt $ diff before.txt after.txt 39,49d38 < data_structures/stacks/stack.py:31: error: Function is missing a type annotation < data_structures/stacks/stack.py:37: error: Function is missing a return type annotation < data_structures/stacks/stack.py:50: error: Function is missing a return type annotation < data_structures/stacks/stack.py:74: error: Function is missing a type annotation for one or more arguments < data_structures/stacks/stack.py:90: error: Call to untyped function "pop" in typed context < data_structures/stacks/stack.py:96: error: Call to untyped function "peek" in typed context < data_structures/stacks/stack.py:103: error: Call to untyped function "push" in typed context < data_structures/stacks/stack.py:109: error: Call to untyped function "pop" in typed context < data_structures/stacks/stack.py:110: error: Call to untyped function "peek" in typed context < data_structures/stacks/stack.py:112: error: Call to untyped function "push" in typed context < data_structures/stacks/stack.py:116: error: Call to untyped function "push" in typed context 52,62d40 < data_structures/stacks/dijkstras_two_stack_algorithm.py:60: error: Call to untyped function "push" in typed context < data_structures/stacks/dijkstras_two_stack_algorithm.py:63: error: Call to untyped function "push" in typed context < data_structures/stacks/dijkstras_two_stack_algorithm.py:66: error: Call to untyped function "peek" in typed context < data_structures/stacks/dijkstras_two_stack_algorithm.py:67: error: Call to untyped function "pop" in typed context < data_structures/stacks/dijkstras_two_stack_algorithm.py:68: error: Call to untyped function "peek" in typed context < data_structures/stacks/dijkstras_two_stack_algorithm.py:69: error: Call to untyped function "pop" in typed context < data_structures/stacks/dijkstras_two_stack_algorithm.py:70: error: Call to untyped function "peek" in typed context < data_structures/stacks/dijkstras_two_stack_algorithm.py:71: error: Call to untyped function "pop" in typed context < data_structures/stacks/dijkstras_two_stack_algorithm.py:74: error: Call to untyped function "push" in typed context < data_structures/stacks/dijkstras_two_stack_algorithm.py:77: error: Returning Any from function declared to return "int" < data_structures/stacks/dijkstras_two_stack_algorithm.py:77: error: Call to untyped function "peek" in typed context 75,85c53 < data_structures/stacks/balanced_parentheses.py:21: error: Call to untyped function "push" in typed context < data_structures/stacks/balanced_parentheses.py:23: error: Call to untyped function "pop" in typed context < data_structures/stacks/infix_to_postfix_conversion.py:47: error: Call to untyped function "push" in typed context < data_structures/stacks/infix_to_postfix_conversion.py:49: error: Call to untyped function "peek" in typed context < data_structures/stacks/infix_to_postfix_conversion.py:50: error: Call to untyped function "pop" in typed context < data_structures/stacks/infix_to_postfix_conversion.py:51: error: Call to untyped function "pop" in typed context < data_structures/stacks/infix_to_postfix_conversion.py:53: error: Call to untyped 
function "peek" in typed context < data_structures/stacks/infix_to_postfix_conversion.py:54: error: Call to untyped function "pop" in typed context < data_structures/stacks/infix_to_postfix_conversion.py:55: error: Call to untyped function "push" in typed context < data_structures/stacks/infix_to_postfix_conversion.py:57: error: Call to untyped function "pop" in typed context < Found 82 errors in 12 files (checked 13 source files) --- > Found 50 errors in 8 files (checked 13 source files) ``` Related to #4052 ### **Describe your change:** Add generic type annotations to functions in `stack.py`. Add type annotations to variables of `Stack` class. * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
archaengel
"2021-10-23T18:21:43Z"
"2021-10-26T18:33:08Z"
582f57f41fb9d36ae8fe4d49c98775877b9013b7
c0ed031b3fcf47736f98dfd89e2588dbffceadde
[mypy] Fix type annotations for stack.py. ``` $ git checkout mypy-fix-stacks-stack Switched to branch 'mypy-fix-stacks-stack' $ mypy --ignore-missing-imports data_structures/stacks/stack.py --strict Success: no issues found in 1 source file $ mypy --ignore-missing-imports data_structures/stacks --strict > after.txt $ git checkout master Switched to branch 'master' $ mypy --ignore-missing-imports data_structures/stacks --strict > before.txt $ diff before.txt after.txt 39,49d38 < data_structures/stacks/stack.py:31: error: Function is missing a type annotation < data_structures/stacks/stack.py:37: error: Function is missing a return type annotation < data_structures/stacks/stack.py:50: error: Function is missing a return type annotation < data_structures/stacks/stack.py:74: error: Function is missing a type annotation for one or more arguments < data_structures/stacks/stack.py:90: error: Call to untyped function "pop" in typed context < data_structures/stacks/stack.py:96: error: Call to untyped function "peek" in typed context < data_structures/stacks/stack.py:103: error: Call to untyped function "push" in typed context < data_structures/stacks/stack.py:109: error: Call to untyped function "pop" in typed context < data_structures/stacks/stack.py:110: error: Call to untyped function "peek" in typed context < data_structures/stacks/stack.py:112: error: Call to untyped function "push" in typed context < data_structures/stacks/stack.py:116: error: Call to untyped function "push" in typed context 52,62d40 < data_structures/stacks/dijkstras_two_stack_algorithm.py:60: error: Call to untyped function "push" in typed context < data_structures/stacks/dijkstras_two_stack_algorithm.py:63: error: Call to untyped function "push" in typed context < data_structures/stacks/dijkstras_two_stack_algorithm.py:66: error: Call to untyped function "peek" in typed context < data_structures/stacks/dijkstras_two_stack_algorithm.py:67: error: Call to untyped function "pop" in typed context < data_structures/stacks/dijkstras_two_stack_algorithm.py:68: error: Call to untyped function "peek" in typed context < data_structures/stacks/dijkstras_two_stack_algorithm.py:69: error: Call to untyped function "pop" in typed context < data_structures/stacks/dijkstras_two_stack_algorithm.py:70: error: Call to untyped function "peek" in typed context < data_structures/stacks/dijkstras_two_stack_algorithm.py:71: error: Call to untyped function "pop" in typed context < data_structures/stacks/dijkstras_two_stack_algorithm.py:74: error: Call to untyped function "push" in typed context < data_structures/stacks/dijkstras_two_stack_algorithm.py:77: error: Returning Any from function declared to return "int" < data_structures/stacks/dijkstras_two_stack_algorithm.py:77: error: Call to untyped function "peek" in typed context 75,85c53 < data_structures/stacks/balanced_parentheses.py:21: error: Call to untyped function "push" in typed context < data_structures/stacks/balanced_parentheses.py:23: error: Call to untyped function "pop" in typed context < data_structures/stacks/infix_to_postfix_conversion.py:47: error: Call to untyped function "push" in typed context < data_structures/stacks/infix_to_postfix_conversion.py:49: error: Call to untyped function "peek" in typed context < data_structures/stacks/infix_to_postfix_conversion.py:50: error: Call to untyped function "pop" in typed context < data_structures/stacks/infix_to_postfix_conversion.py:51: error: Call to untyped function "pop" in typed context < 
data_structures/stacks/infix_to_postfix_conversion.py:53: error: Call to untyped function "peek" in typed context < data_structures/stacks/infix_to_postfix_conversion.py:54: error: Call to untyped function "pop" in typed context < data_structures/stacks/infix_to_postfix_conversion.py:55: error: Call to untyped function "push" in typed context < data_structures/stacks/infix_to_postfix_conversion.py:57: error: Call to untyped function "pop" in typed context < Found 82 errors in 12 files (checked 13 source files) --- > Found 50 errors in 8 files (checked 13 source files) ``` Related to #4052 ### **Describe your change:** Add generic type annotations to functions in `stack.py`. Add type annotations to variables of `Stack` class. * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
from .stack import Stack def balanced_parentheses(parentheses: str) -> bool: """Use a stack to check if a string of parentheses is balanced. >>> balanced_parentheses("([]{})") True >>> balanced_parentheses("[()]{}{[()()]()}") True >>> balanced_parentheses("[(])") False >>> balanced_parentheses("1+2*3-4") True >>> balanced_parentheses("") True """ stack = Stack() bracket_pairs = {"(": ")", "[": "]", "{": "}"} for bracket in parentheses: if bracket in bracket_pairs: stack.push(bracket) elif bracket in (")", "]", "}"): if stack.is_empty() or bracket_pairs[stack.pop()] != bracket: return False return stack.is_empty() if __name__ == "__main__": from doctest import testmod testmod() examples = ["((()))", "((())", "(()))"] print("Balanced parentheses demonstration:\n") for example in examples: not_str = "" if balanced_parentheses(example) else "not " print(f"{example} is {not_str}balanced")
from .stack import Stack def balanced_parentheses(parentheses: str) -> bool: """Use a stack to check if a string of parentheses is balanced. >>> balanced_parentheses("([]{})") True >>> balanced_parentheses("[()]{}{[()()]()}") True >>> balanced_parentheses("[(])") False >>> balanced_parentheses("1+2*3-4") True >>> balanced_parentheses("") True """ stack: Stack[str] = Stack() bracket_pairs = {"(": ")", "[": "]", "{": "}"} for bracket in parentheses: if bracket in bracket_pairs: stack.push(bracket) elif bracket in (")", "]", "}"): if stack.is_empty() or bracket_pairs[stack.pop()] != bracket: return False return stack.is_empty() if __name__ == "__main__": from doctest import testmod testmod() examples = ["((()))", "((())", "(()))"] print("Balanced parentheses demonstration:\n") for example in examples: not_str = "" if balanced_parentheses(example) else "not " print(f"{example} is {not_str}balanced")
1
TheAlgorithms/Python
5,566
[mypy] Fix type annotations for stack.py
``` $ git checkout mypy-fix-stacks-stack Switched to branch 'mypy-fix-stacks-stack' $ mypy --ignore-missing-imports data_structures/stacks/stack.py --strict Success: no issues found in 1 source file $ mypy --ignore-missing-imports data_structures/stacks --strict > after.txt $ git checkout master Switched to branch 'master' $ mypy --ignore-missing-imports data_structures/stacks --strict > before.txt $ diff before.txt after.txt 39,49d38 < data_structures/stacks/stack.py:31: error: Function is missing a type annotation < data_structures/stacks/stack.py:37: error: Function is missing a return type annotation < data_structures/stacks/stack.py:50: error: Function is missing a return type annotation < data_structures/stacks/stack.py:74: error: Function is missing a type annotation for one or more arguments < data_structures/stacks/stack.py:90: error: Call to untyped function "pop" in typed context < data_structures/stacks/stack.py:96: error: Call to untyped function "peek" in typed context < data_structures/stacks/stack.py:103: error: Call to untyped function "push" in typed context < data_structures/stacks/stack.py:109: error: Call to untyped function "pop" in typed context < data_structures/stacks/stack.py:110: error: Call to untyped function "peek" in typed context < data_structures/stacks/stack.py:112: error: Call to untyped function "push" in typed context < data_structures/stacks/stack.py:116: error: Call to untyped function "push" in typed context 52,62d40 < data_structures/stacks/dijkstras_two_stack_algorithm.py:60: error: Call to untyped function "push" in typed context < data_structures/stacks/dijkstras_two_stack_algorithm.py:63: error: Call to untyped function "push" in typed context < data_structures/stacks/dijkstras_two_stack_algorithm.py:66: error: Call to untyped function "peek" in typed context < data_structures/stacks/dijkstras_two_stack_algorithm.py:67: error: Call to untyped function "pop" in typed context < data_structures/stacks/dijkstras_two_stack_algorithm.py:68: error: Call to untyped function "peek" in typed context < data_structures/stacks/dijkstras_two_stack_algorithm.py:69: error: Call to untyped function "pop" in typed context < data_structures/stacks/dijkstras_two_stack_algorithm.py:70: error: Call to untyped function "peek" in typed context < data_structures/stacks/dijkstras_two_stack_algorithm.py:71: error: Call to untyped function "pop" in typed context < data_structures/stacks/dijkstras_two_stack_algorithm.py:74: error: Call to untyped function "push" in typed context < data_structures/stacks/dijkstras_two_stack_algorithm.py:77: error: Returning Any from function declared to return "int" < data_structures/stacks/dijkstras_two_stack_algorithm.py:77: error: Call to untyped function "peek" in typed context 75,85c53 < data_structures/stacks/balanced_parentheses.py:21: error: Call to untyped function "push" in typed context < data_structures/stacks/balanced_parentheses.py:23: error: Call to untyped function "pop" in typed context < data_structures/stacks/infix_to_postfix_conversion.py:47: error: Call to untyped function "push" in typed context < data_structures/stacks/infix_to_postfix_conversion.py:49: error: Call to untyped function "peek" in typed context < data_structures/stacks/infix_to_postfix_conversion.py:50: error: Call to untyped function "pop" in typed context < data_structures/stacks/infix_to_postfix_conversion.py:51: error: Call to untyped function "pop" in typed context < data_structures/stacks/infix_to_postfix_conversion.py:53: error: Call to untyped 
function "peek" in typed context < data_structures/stacks/infix_to_postfix_conversion.py:54: error: Call to untyped function "pop" in typed context < data_structures/stacks/infix_to_postfix_conversion.py:55: error: Call to untyped function "push" in typed context < data_structures/stacks/infix_to_postfix_conversion.py:57: error: Call to untyped function "pop" in typed context < Found 82 errors in 12 files (checked 13 source files) --- > Found 50 errors in 8 files (checked 13 source files) ``` Related to #4052 ### **Describe your change:** Add generic type annotations to functions in `stack.py`. Add type annotations to variables of `Stack` class. * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
archaengel
"2021-10-23T18:21:43Z"
"2021-10-26T18:33:08Z"
582f57f41fb9d36ae8fe4d49c98775877b9013b7
c0ed031b3fcf47736f98dfd89e2588dbffceadde
[mypy] Fix type annotations for stack.py. ``` $ git checkout mypy-fix-stacks-stack Switched to branch 'mypy-fix-stacks-stack' $ mypy --ignore-missing-imports data_structures/stacks/stack.py --strict Success: no issues found in 1 source file $ mypy --ignore-missing-imports data_structures/stacks --strict > after.txt $ git checkout master Switched to branch 'master' $ mypy --ignore-missing-imports data_structures/stacks --strict > before.txt $ diff before.txt after.txt 39,49d38 < data_structures/stacks/stack.py:31: error: Function is missing a type annotation < data_structures/stacks/stack.py:37: error: Function is missing a return type annotation < data_structures/stacks/stack.py:50: error: Function is missing a return type annotation < data_structures/stacks/stack.py:74: error: Function is missing a type annotation for one or more arguments < data_structures/stacks/stack.py:90: error: Call to untyped function "pop" in typed context < data_structures/stacks/stack.py:96: error: Call to untyped function "peek" in typed context < data_structures/stacks/stack.py:103: error: Call to untyped function "push" in typed context < data_structures/stacks/stack.py:109: error: Call to untyped function "pop" in typed context < data_structures/stacks/stack.py:110: error: Call to untyped function "peek" in typed context < data_structures/stacks/stack.py:112: error: Call to untyped function "push" in typed context < data_structures/stacks/stack.py:116: error: Call to untyped function "push" in typed context 52,62d40 < data_structures/stacks/dijkstras_two_stack_algorithm.py:60: error: Call to untyped function "push" in typed context < data_structures/stacks/dijkstras_two_stack_algorithm.py:63: error: Call to untyped function "push" in typed context < data_structures/stacks/dijkstras_two_stack_algorithm.py:66: error: Call to untyped function "peek" in typed context < data_structures/stacks/dijkstras_two_stack_algorithm.py:67: error: Call to untyped function "pop" in typed context < data_structures/stacks/dijkstras_two_stack_algorithm.py:68: error: Call to untyped function "peek" in typed context < data_structures/stacks/dijkstras_two_stack_algorithm.py:69: error: Call to untyped function "pop" in typed context < data_structures/stacks/dijkstras_two_stack_algorithm.py:70: error: Call to untyped function "peek" in typed context < data_structures/stacks/dijkstras_two_stack_algorithm.py:71: error: Call to untyped function "pop" in typed context < data_structures/stacks/dijkstras_two_stack_algorithm.py:74: error: Call to untyped function "push" in typed context < data_structures/stacks/dijkstras_two_stack_algorithm.py:77: error: Returning Any from function declared to return "int" < data_structures/stacks/dijkstras_two_stack_algorithm.py:77: error: Call to untyped function "peek" in typed context 75,85c53 < data_structures/stacks/balanced_parentheses.py:21: error: Call to untyped function "push" in typed context < data_structures/stacks/balanced_parentheses.py:23: error: Call to untyped function "pop" in typed context < data_structures/stacks/infix_to_postfix_conversion.py:47: error: Call to untyped function "push" in typed context < data_structures/stacks/infix_to_postfix_conversion.py:49: error: Call to untyped function "peek" in typed context < data_structures/stacks/infix_to_postfix_conversion.py:50: error: Call to untyped function "pop" in typed context < data_structures/stacks/infix_to_postfix_conversion.py:51: error: Call to untyped function "pop" in typed context < 
data_structures/stacks/infix_to_postfix_conversion.py:53: error: Call to untyped function "peek" in typed context < data_structures/stacks/infix_to_postfix_conversion.py:54: error: Call to untyped function "pop" in typed context < data_structures/stacks/infix_to_postfix_conversion.py:55: error: Call to untyped function "push" in typed context < data_structures/stacks/infix_to_postfix_conversion.py:57: error: Call to untyped function "pop" in typed context < Found 82 errors in 12 files (checked 13 source files) --- > Found 50 errors in 8 files (checked 13 source files) ``` Related to #4052 ### **Describe your change:** Add generic type annotations to functions in `stack.py`. Add type annotations to variables of `Stack` class. * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Author: Alexander Joslin GitHub: github.com/echoaj Explanation: https://medium.com/@haleesammar/implemented-in-js-dijkstras-2-stack- algorithm-for-evaluating-mathematical-expressions-fc0837dae1ea We can use Dijkstra's two stack algorithm to solve an equation such as: (5 + ((4 * 2) * (2 + 3))) THESE ARE THE ALGORITHM'S RULES: RULE 1: Scan the expression from left to right. When an operand is encountered, push it onto the the operand stack. RULE 2: When an operator is encountered in the expression, push it onto the operator stack. RULE 3: When a left parenthesis is encountered in the expression, ignore it. RULE 4: When a right parenthesis is encountered in the expression, pop an operator off the operator stack. The two operands it must operate on must be the last two operands pushed onto the operand stack. We therefore pop the operand stack twice, perform the operation, and push the result back onto the operand stack so it will be available for use as an operand of the next operator popped off the operator stack. RULE 5: When the entire infix expression has been scanned, the value left on the operand stack represents the value of the expression. NOTE: It only works with whole numbers. """ __author__ = "Alexander Joslin" import operator as op from .stack import Stack def dijkstras_two_stack_algorithm(equation: str) -> int: """ DocTests >>> dijkstras_two_stack_algorithm("(5 + 3)") 8 >>> dijkstras_two_stack_algorithm("((9 - (2 + 9)) + (8 - 1))") 5 >>> dijkstras_two_stack_algorithm("((((3 - 2) - (2 + 3)) + (2 - 4)) + 3)") -3 :param equation: a string :return: result: an integer """ operators = {"*": op.mul, "/": op.truediv, "+": op.add, "-": op.sub} operand_stack = Stack() operator_stack = Stack() for i in equation: if i.isdigit(): # RULE 1 operand_stack.push(int(i)) elif i in operators: # RULE 2 operator_stack.push(i) elif i == ")": # RULE 4 opr = operator_stack.peek() operator_stack.pop() num1 = operand_stack.peek() operand_stack.pop() num2 = operand_stack.peek() operand_stack.pop() total = operators[opr](num2, num1) operand_stack.push(total) # RULE 5 return operand_stack.peek() if __name__ == "__main__": equation = "(5 + ((4 * 2) * (2 + 3)))" # answer = 45 print(f"{equation} = {dijkstras_two_stack_algorithm(equation)}")
""" Author: Alexander Joslin GitHub: github.com/echoaj Explanation: https://medium.com/@haleesammar/implemented-in-js-dijkstras-2-stack- algorithm-for-evaluating-mathematical-expressions-fc0837dae1ea We can use Dijkstra's two stack algorithm to solve an equation such as: (5 + ((4 * 2) * (2 + 3))) THESE ARE THE ALGORITHM'S RULES: RULE 1: Scan the expression from left to right. When an operand is encountered, push it onto the the operand stack. RULE 2: When an operator is encountered in the expression, push it onto the operator stack. RULE 3: When a left parenthesis is encountered in the expression, ignore it. RULE 4: When a right parenthesis is encountered in the expression, pop an operator off the operator stack. The two operands it must operate on must be the last two operands pushed onto the operand stack. We therefore pop the operand stack twice, perform the operation, and push the result back onto the operand stack so it will be available for use as an operand of the next operator popped off the operator stack. RULE 5: When the entire infix expression has been scanned, the value left on the operand stack represents the value of the expression. NOTE: It only works with whole numbers. """ __author__ = "Alexander Joslin" import operator as op from .stack import Stack def dijkstras_two_stack_algorithm(equation: str) -> int: """ DocTests >>> dijkstras_two_stack_algorithm("(5 + 3)") 8 >>> dijkstras_two_stack_algorithm("((9 - (2 + 9)) + (8 - 1))") 5 >>> dijkstras_two_stack_algorithm("((((3 - 2) - (2 + 3)) + (2 - 4)) + 3)") -3 :param equation: a string :return: result: an integer """ operators = {"*": op.mul, "/": op.truediv, "+": op.add, "-": op.sub} operand_stack: Stack[int] = Stack() operator_stack: Stack[str] = Stack() for i in equation: if i.isdigit(): # RULE 1 operand_stack.push(int(i)) elif i in operators: # RULE 2 operator_stack.push(i) elif i == ")": # RULE 4 opr = operator_stack.peek() operator_stack.pop() num1 = operand_stack.peek() operand_stack.pop() num2 = operand_stack.peek() operand_stack.pop() total = operators[opr](num2, num1) operand_stack.push(total) # RULE 5 return operand_stack.peek() if __name__ == "__main__": equation = "(5 + ((4 * 2) * (2 + 3)))" # answer = 45 print(f"{equation} = {dijkstras_two_stack_algorithm(equation)}")
1
TheAlgorithms/Python
5,566
[mypy] Fix type annotations for stack.py
``` $ git checkout mypy-fix-stacks-stack Switched to branch 'mypy-fix-stacks-stack' $ mypy --ignore-missing-imports data_structures/stacks/stack.py --strict Success: no issues found in 1 source file $ mypy --ignore-missing-imports data_structures/stacks --strict > after.txt $ git checkout master Switched to branch 'master' $ mypy --ignore-missing-imports data_structures/stacks --strict > before.txt $ diff before.txt after.txt 39,49d38 < data_structures/stacks/stack.py:31: error: Function is missing a type annotation < data_structures/stacks/stack.py:37: error: Function is missing a return type annotation < data_structures/stacks/stack.py:50: error: Function is missing a return type annotation < data_structures/stacks/stack.py:74: error: Function is missing a type annotation for one or more arguments < data_structures/stacks/stack.py:90: error: Call to untyped function "pop" in typed context < data_structures/stacks/stack.py:96: error: Call to untyped function "peek" in typed context < data_structures/stacks/stack.py:103: error: Call to untyped function "push" in typed context < data_structures/stacks/stack.py:109: error: Call to untyped function "pop" in typed context < data_structures/stacks/stack.py:110: error: Call to untyped function "peek" in typed context < data_structures/stacks/stack.py:112: error: Call to untyped function "push" in typed context < data_structures/stacks/stack.py:116: error: Call to untyped function "push" in typed context 52,62d40 < data_structures/stacks/dijkstras_two_stack_algorithm.py:60: error: Call to untyped function "push" in typed context < data_structures/stacks/dijkstras_two_stack_algorithm.py:63: error: Call to untyped function "push" in typed context < data_structures/stacks/dijkstras_two_stack_algorithm.py:66: error: Call to untyped function "peek" in typed context < data_structures/stacks/dijkstras_two_stack_algorithm.py:67: error: Call to untyped function "pop" in typed context < data_structures/stacks/dijkstras_two_stack_algorithm.py:68: error: Call to untyped function "peek" in typed context < data_structures/stacks/dijkstras_two_stack_algorithm.py:69: error: Call to untyped function "pop" in typed context < data_structures/stacks/dijkstras_two_stack_algorithm.py:70: error: Call to untyped function "peek" in typed context < data_structures/stacks/dijkstras_two_stack_algorithm.py:71: error: Call to untyped function "pop" in typed context < data_structures/stacks/dijkstras_two_stack_algorithm.py:74: error: Call to untyped function "push" in typed context < data_structures/stacks/dijkstras_two_stack_algorithm.py:77: error: Returning Any from function declared to return "int" < data_structures/stacks/dijkstras_two_stack_algorithm.py:77: error: Call to untyped function "peek" in typed context 75,85c53 < data_structures/stacks/balanced_parentheses.py:21: error: Call to untyped function "push" in typed context < data_structures/stacks/balanced_parentheses.py:23: error: Call to untyped function "pop" in typed context < data_structures/stacks/infix_to_postfix_conversion.py:47: error: Call to untyped function "push" in typed context < data_structures/stacks/infix_to_postfix_conversion.py:49: error: Call to untyped function "peek" in typed context < data_structures/stacks/infix_to_postfix_conversion.py:50: error: Call to untyped function "pop" in typed context < data_structures/stacks/infix_to_postfix_conversion.py:51: error: Call to untyped function "pop" in typed context < data_structures/stacks/infix_to_postfix_conversion.py:53: error: Call to untyped 
function "peek" in typed context < data_structures/stacks/infix_to_postfix_conversion.py:54: error: Call to untyped function "pop" in typed context < data_structures/stacks/infix_to_postfix_conversion.py:55: error: Call to untyped function "push" in typed context < data_structures/stacks/infix_to_postfix_conversion.py:57: error: Call to untyped function "pop" in typed context < Found 82 errors in 12 files (checked 13 source files) --- > Found 50 errors in 8 files (checked 13 source files) ``` Related to #4052 ### **Describe your change:** Add generic type annotations to functions in `stack.py`. Add type annotations to variables of `Stack` class. * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
archaengel
"2021-10-23T18:21:43Z"
"2021-10-26T18:33:08Z"
582f57f41fb9d36ae8fe4d49c98775877b9013b7
c0ed031b3fcf47736f98dfd89e2588dbffceadde
[mypy] Fix type annotations for stack.py. ``` $ git checkout mypy-fix-stacks-stack Switched to branch 'mypy-fix-stacks-stack' $ mypy --ignore-missing-imports data_structures/stacks/stack.py --strict Success: no issues found in 1 source file $ mypy --ignore-missing-imports data_structures/stacks --strict > after.txt $ git checkout master Switched to branch 'master' $ mypy --ignore-missing-imports data_structures/stacks --strict > before.txt $ diff before.txt after.txt 39,49d38 < data_structures/stacks/stack.py:31: error: Function is missing a type annotation < data_structures/stacks/stack.py:37: error: Function is missing a return type annotation < data_structures/stacks/stack.py:50: error: Function is missing a return type annotation < data_structures/stacks/stack.py:74: error: Function is missing a type annotation for one or more arguments < data_structures/stacks/stack.py:90: error: Call to untyped function "pop" in typed context < data_structures/stacks/stack.py:96: error: Call to untyped function "peek" in typed context < data_structures/stacks/stack.py:103: error: Call to untyped function "push" in typed context < data_structures/stacks/stack.py:109: error: Call to untyped function "pop" in typed context < data_structures/stacks/stack.py:110: error: Call to untyped function "peek" in typed context < data_structures/stacks/stack.py:112: error: Call to untyped function "push" in typed context < data_structures/stacks/stack.py:116: error: Call to untyped function "push" in typed context 52,62d40 < data_structures/stacks/dijkstras_two_stack_algorithm.py:60: error: Call to untyped function "push" in typed context < data_structures/stacks/dijkstras_two_stack_algorithm.py:63: error: Call to untyped function "push" in typed context < data_structures/stacks/dijkstras_two_stack_algorithm.py:66: error: Call to untyped function "peek" in typed context < data_structures/stacks/dijkstras_two_stack_algorithm.py:67: error: Call to untyped function "pop" in typed context < data_structures/stacks/dijkstras_two_stack_algorithm.py:68: error: Call to untyped function "peek" in typed context < data_structures/stacks/dijkstras_two_stack_algorithm.py:69: error: Call to untyped function "pop" in typed context < data_structures/stacks/dijkstras_two_stack_algorithm.py:70: error: Call to untyped function "peek" in typed context < data_structures/stacks/dijkstras_two_stack_algorithm.py:71: error: Call to untyped function "pop" in typed context < data_structures/stacks/dijkstras_two_stack_algorithm.py:74: error: Call to untyped function "push" in typed context < data_structures/stacks/dijkstras_two_stack_algorithm.py:77: error: Returning Any from function declared to return "int" < data_structures/stacks/dijkstras_two_stack_algorithm.py:77: error: Call to untyped function "peek" in typed context 75,85c53 < data_structures/stacks/balanced_parentheses.py:21: error: Call to untyped function "push" in typed context < data_structures/stacks/balanced_parentheses.py:23: error: Call to untyped function "pop" in typed context < data_structures/stacks/infix_to_postfix_conversion.py:47: error: Call to untyped function "push" in typed context < data_structures/stacks/infix_to_postfix_conversion.py:49: error: Call to untyped function "peek" in typed context < data_structures/stacks/infix_to_postfix_conversion.py:50: error: Call to untyped function "pop" in typed context < data_structures/stacks/infix_to_postfix_conversion.py:51: error: Call to untyped function "pop" in typed context < 
data_structures/stacks/infix_to_postfix_conversion.py:53: error: Call to untyped function "peek" in typed context < data_structures/stacks/infix_to_postfix_conversion.py:54: error: Call to untyped function "pop" in typed context < data_structures/stacks/infix_to_postfix_conversion.py:55: error: Call to untyped function "push" in typed context < data_structures/stacks/infix_to_postfix_conversion.py:57: error: Call to untyped function "pop" in typed context < Found 82 errors in 12 files (checked 13 source files) --- > Found 50 errors in 8 files (checked 13 source files) ``` Related to #4052 ### **Describe your change:** Add generic type annotations to functions in `stack.py`. Add type annotations to variables of `Stack` class. * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### **Checklist:** * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" https://en.wikipedia.org/wiki/Infix_notation https://en.wikipedia.org/wiki/Reverse_Polish_notation https://en.wikipedia.org/wiki/Shunting-yard_algorithm """ from .balanced_parentheses import balanced_parentheses from .stack import Stack def precedence(char: str) -> int: """ Return integer value representing an operator's precedence, or order of operation. https://en.wikipedia.org/wiki/Order_of_operations """ return {"+": 1, "-": 1, "*": 2, "/": 2, "^": 3}.get(char, -1) def infix_to_postfix(expression_str: str) -> str: """ >>> infix_to_postfix("(1*(2+3)+4))") Traceback (most recent call last): ... ValueError: Mismatched parentheses >>> infix_to_postfix("") '' >>> infix_to_postfix("3+2") '3 2 +' >>> infix_to_postfix("(3+4)*5-6") '3 4 + 5 * 6 -' >>> infix_to_postfix("(1+2)*3/4-5") '1 2 + 3 * 4 / 5 -' >>> infix_to_postfix("a+b*c+(d*e+f)*g") 'a b c * + d e * f + g * +' >>> infix_to_postfix("x^y/(5*z)+2") 'x y ^ 5 z * / 2 +' """ if not balanced_parentheses(expression_str): raise ValueError("Mismatched parentheses") stack = Stack() postfix = [] for char in expression_str: if char.isalpha() or char.isdigit(): postfix.append(char) elif char == "(": stack.push(char) elif char == ")": while not stack.is_empty() and stack.peek() != "(": postfix.append(stack.pop()) stack.pop() else: while not stack.is_empty() and precedence(char) <= precedence(stack.peek()): postfix.append(stack.pop()) stack.push(char) while not stack.is_empty(): postfix.append(stack.pop()) return " ".join(postfix) if __name__ == "__main__": from doctest import testmod testmod() expression = "a+b*(c^d-e)^(f+g*h)-i" print("Infix to Postfix Notation demonstration:\n") print("Infix notation: " + expression) print("Postfix notation: " + infix_to_postfix(expression))
""" https://en.wikipedia.org/wiki/Infix_notation https://en.wikipedia.org/wiki/Reverse_Polish_notation https://en.wikipedia.org/wiki/Shunting-yard_algorithm """ from .balanced_parentheses import balanced_parentheses from .stack import Stack def precedence(char: str) -> int: """ Return integer value representing an operator's precedence, or order of operation. https://en.wikipedia.org/wiki/Order_of_operations """ return {"+": 1, "-": 1, "*": 2, "/": 2, "^": 3}.get(char, -1) def infix_to_postfix(expression_str: str) -> str: """ >>> infix_to_postfix("(1*(2+3)+4))") Traceback (most recent call last): ... ValueError: Mismatched parentheses >>> infix_to_postfix("") '' >>> infix_to_postfix("3+2") '3 2 +' >>> infix_to_postfix("(3+4)*5-6") '3 4 + 5 * 6 -' >>> infix_to_postfix("(1+2)*3/4-5") '1 2 + 3 * 4 / 5 -' >>> infix_to_postfix("a+b*c+(d*e+f)*g") 'a b c * + d e * f + g * +' >>> infix_to_postfix("x^y/(5*z)+2") 'x y ^ 5 z * / 2 +' """ if not balanced_parentheses(expression_str): raise ValueError("Mismatched parentheses") stack: Stack[str] = Stack() postfix = [] for char in expression_str: if char.isalpha() or char.isdigit(): postfix.append(char) elif char == "(": stack.push(char) elif char == ")": while not stack.is_empty() and stack.peek() != "(": postfix.append(stack.pop()) stack.pop() else: while not stack.is_empty() and precedence(char) <= precedence(stack.peek()): postfix.append(stack.pop()) stack.push(char) while not stack.is_empty(): postfix.append(stack.pop()) return " ".join(postfix) if __name__ == "__main__": from doctest import testmod testmod() expression = "a+b*(c^d-e)^(f+g*h)-i" print("Infix to Postfix Notation demonstration:\n") print("Infix notation: " + expression) print("Postfix notation: " + infix_to_postfix(expression))
1
TheAlgorithms/Python
5,566
[mypy] Fix type annotations for stack.py
```
$ git checkout mypy-fix-stacks-stack
Switched to branch 'mypy-fix-stacks-stack'
$ mypy --ignore-missing-imports data_structures/stacks/stack.py --strict
Success: no issues found in 1 source file
$ mypy --ignore-missing-imports data_structures/stacks --strict > after.txt
$ git checkout master
Switched to branch 'master'
$ mypy --ignore-missing-imports data_structures/stacks --strict > before.txt
$ diff before.txt after.txt
39,49d38
< data_structures/stacks/stack.py:31: error: Function is missing a type annotation
< data_structures/stacks/stack.py:37: error: Function is missing a return type annotation
< data_structures/stacks/stack.py:50: error: Function is missing a return type annotation
< data_structures/stacks/stack.py:74: error: Function is missing a type annotation for one or more arguments
< data_structures/stacks/stack.py:90: error: Call to untyped function "pop" in typed context
< data_structures/stacks/stack.py:96: error: Call to untyped function "peek" in typed context
< data_structures/stacks/stack.py:103: error: Call to untyped function "push" in typed context
< data_structures/stacks/stack.py:109: error: Call to untyped function "pop" in typed context
< data_structures/stacks/stack.py:110: error: Call to untyped function "peek" in typed context
< data_structures/stacks/stack.py:112: error: Call to untyped function "push" in typed context
< data_structures/stacks/stack.py:116: error: Call to untyped function "push" in typed context
52,62d40
< data_structures/stacks/dijkstras_two_stack_algorithm.py:60: error: Call to untyped function "push" in typed context
< data_structures/stacks/dijkstras_two_stack_algorithm.py:63: error: Call to untyped function "push" in typed context
< data_structures/stacks/dijkstras_two_stack_algorithm.py:66: error: Call to untyped function "peek" in typed context
< data_structures/stacks/dijkstras_two_stack_algorithm.py:67: error: Call to untyped function "pop" in typed context
< data_structures/stacks/dijkstras_two_stack_algorithm.py:68: error: Call to untyped function "peek" in typed context
< data_structures/stacks/dijkstras_two_stack_algorithm.py:69: error: Call to untyped function "pop" in typed context
< data_structures/stacks/dijkstras_two_stack_algorithm.py:70: error: Call to untyped function "peek" in typed context
< data_structures/stacks/dijkstras_two_stack_algorithm.py:71: error: Call to untyped function "pop" in typed context
< data_structures/stacks/dijkstras_two_stack_algorithm.py:74: error: Call to untyped function "push" in typed context
< data_structures/stacks/dijkstras_two_stack_algorithm.py:77: error: Returning Any from function declared to return "int"
< data_structures/stacks/dijkstras_two_stack_algorithm.py:77: error: Call to untyped function "peek" in typed context
75,85c53
< data_structures/stacks/balanced_parentheses.py:21: error: Call to untyped function "push" in typed context
< data_structures/stacks/balanced_parentheses.py:23: error: Call to untyped function "pop" in typed context
< data_structures/stacks/infix_to_postfix_conversion.py:47: error: Call to untyped function "push" in typed context
< data_structures/stacks/infix_to_postfix_conversion.py:49: error: Call to untyped function "peek" in typed context
< data_structures/stacks/infix_to_postfix_conversion.py:50: error: Call to untyped function "pop" in typed context
< data_structures/stacks/infix_to_postfix_conversion.py:51: error: Call to untyped function "pop" in typed context
< data_structures/stacks/infix_to_postfix_conversion.py:53: error: Call to untyped function "peek" in typed context
< data_structures/stacks/infix_to_postfix_conversion.py:54: error: Call to untyped function "pop" in typed context
< data_structures/stacks/infix_to_postfix_conversion.py:55: error: Call to untyped function "push" in typed context
< data_structures/stacks/infix_to_postfix_conversion.py:57: error: Call to untyped function "pop" in typed context
< Found 82 errors in 12 files (checked 13 source files)
---
> Found 50 errors in 8 files (checked 13 source files)
```

Related to #4052

### **Describe your change:**

Add generic type annotations to functions in `stack.py`. Add type annotations to variables of `Stack` class.

* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?

### **Checklist:**

* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
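Editor's note: a minimal sketch of the `Generic`/`TypeVar` annotation pattern that the description above refers to. It is trimmed for illustration and is not the exact diff from this PR; the full before/after file contents appear further below.

```
# Minimal sketch of the Generic/TypeVar pattern described above (illustrative only).
from __future__ import annotations

from typing import Generic, TypeVar

T = TypeVar("T")


class Stack(Generic[T]):
    def __init__(self, limit: int = 10) -> None:
        self.stack: list[T] = []  # element type is now the type parameter T
        self.limit = limit

    def push(self, data: T) -> None:
        self.stack.append(data)

    def pop(self) -> T:
        return self.stack.pop()
```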
archaengel
"2021-10-23T18:21:43Z"
"2021-10-26T18:33:08Z"
582f57f41fb9d36ae8fe4d49c98775877b9013b7
c0ed031b3fcf47736f98dfd89e2588dbffceadde
[mypy] Fix type annotations for stack.py.

```
$ git checkout mypy-fix-stacks-stack
Switched to branch 'mypy-fix-stacks-stack'
$ mypy --ignore-missing-imports data_structures/stacks/stack.py --strict
Success: no issues found in 1 source file
$ mypy --ignore-missing-imports data_structures/stacks --strict > after.txt
$ git checkout master
Switched to branch 'master'
$ mypy --ignore-missing-imports data_structures/stacks --strict > before.txt
$ diff before.txt after.txt
39,49d38
< data_structures/stacks/stack.py:31: error: Function is missing a type annotation
< data_structures/stacks/stack.py:37: error: Function is missing a return type annotation
< data_structures/stacks/stack.py:50: error: Function is missing a return type annotation
< data_structures/stacks/stack.py:74: error: Function is missing a type annotation for one or more arguments
< data_structures/stacks/stack.py:90: error: Call to untyped function "pop" in typed context
< data_structures/stacks/stack.py:96: error: Call to untyped function "peek" in typed context
< data_structures/stacks/stack.py:103: error: Call to untyped function "push" in typed context
< data_structures/stacks/stack.py:109: error: Call to untyped function "pop" in typed context
< data_structures/stacks/stack.py:110: error: Call to untyped function "peek" in typed context
< data_structures/stacks/stack.py:112: error: Call to untyped function "push" in typed context
< data_structures/stacks/stack.py:116: error: Call to untyped function "push" in typed context
52,62d40
< data_structures/stacks/dijkstras_two_stack_algorithm.py:60: error: Call to untyped function "push" in typed context
< data_structures/stacks/dijkstras_two_stack_algorithm.py:63: error: Call to untyped function "push" in typed context
< data_structures/stacks/dijkstras_two_stack_algorithm.py:66: error: Call to untyped function "peek" in typed context
< data_structures/stacks/dijkstras_two_stack_algorithm.py:67: error: Call to untyped function "pop" in typed context
< data_structures/stacks/dijkstras_two_stack_algorithm.py:68: error: Call to untyped function "peek" in typed context
< data_structures/stacks/dijkstras_two_stack_algorithm.py:69: error: Call to untyped function "pop" in typed context
< data_structures/stacks/dijkstras_two_stack_algorithm.py:70: error: Call to untyped function "peek" in typed context
< data_structures/stacks/dijkstras_two_stack_algorithm.py:71: error: Call to untyped function "pop" in typed context
< data_structures/stacks/dijkstras_two_stack_algorithm.py:74: error: Call to untyped function "push" in typed context
< data_structures/stacks/dijkstras_two_stack_algorithm.py:77: error: Returning Any from function declared to return "int"
< data_structures/stacks/dijkstras_two_stack_algorithm.py:77: error: Call to untyped function "peek" in typed context
75,85c53
< data_structures/stacks/balanced_parentheses.py:21: error: Call to untyped function "push" in typed context
< data_structures/stacks/balanced_parentheses.py:23: error: Call to untyped function "pop" in typed context
< data_structures/stacks/infix_to_postfix_conversion.py:47: error: Call to untyped function "push" in typed context
< data_structures/stacks/infix_to_postfix_conversion.py:49: error: Call to untyped function "peek" in typed context
< data_structures/stacks/infix_to_postfix_conversion.py:50: error: Call to untyped function "pop" in typed context
< data_structures/stacks/infix_to_postfix_conversion.py:51: error: Call to untyped function "pop" in typed context
< data_structures/stacks/infix_to_postfix_conversion.py:53: error: Call to untyped function "peek" in typed context
< data_structures/stacks/infix_to_postfix_conversion.py:54: error: Call to untyped function "pop" in typed context
< data_structures/stacks/infix_to_postfix_conversion.py:55: error: Call to untyped function "push" in typed context
< data_structures/stacks/infix_to_postfix_conversion.py:57: error: Call to untyped function "pop" in typed context
< Found 82 errors in 12 files (checked 13 source files)
---
> Found 50 errors in 8 files (checked 13 source files)
```

Related to #4052

### **Describe your change:**

Add generic type annotations to functions in `stack.py`. Add type annotations to variables of `Stack` class.

* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?

### **Checklist:**

* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
from __future__ import annotations


class StackOverflowError(BaseException):
    pass


class StackUnderflowError(BaseException):
    pass


class Stack:
    """A stack is an abstract data type that serves as a collection of
    elements with two principal operations: push() and pop(). push() adds an
    element to the top of the stack, and pop() removes an element from the
    top of a stack. The order in which elements come off of a stack are
    Last In, First Out (LIFO).
    https://en.wikipedia.org/wiki/Stack_(abstract_data_type)
    """

    def __init__(self, limit: int = 10):
        self.stack: list[int] = []
        self.limit = limit

    def __bool__(self) -> bool:
        return bool(self.stack)

    def __str__(self) -> str:
        return str(self.stack)

    def push(self, data):
        """Push an element to the top of the stack."""
        if len(self.stack) >= self.limit:
            raise StackOverflowError
        self.stack.append(data)

    def pop(self):
        """
        Pop an element off of the top of the stack.

        >>> Stack().pop()
        Traceback (most recent call last):
            ...
        data_structures.stacks.stack.StackUnderflowError
        """
        if not self.stack:
            raise StackUnderflowError
        return self.stack.pop()

    def peek(self):
        """
        Peek at the top-most element of the stack.

        >>> Stack().pop()
        Traceback (most recent call last):
            ...
        data_structures.stacks.stack.StackUnderflowError
        """
        if not self.stack:
            raise StackUnderflowError
        return self.stack[-1]

    def is_empty(self) -> bool:
        """Check if a stack is empty."""
        return not bool(self.stack)

    def is_full(self) -> bool:
        return self.size() == self.limit

    def size(self) -> int:
        """Return the size of the stack."""
        return len(self.stack)

    def __contains__(self, item) -> bool:
        """Check if item is in stack"""
        return item in self.stack


def test_stack() -> None:
    """
    >>> test_stack()
    """
    stack = Stack(10)
    assert bool(stack) is False
    assert stack.is_empty() is True
    assert stack.is_full() is False
    assert str(stack) == "[]"

    try:
        _ = stack.pop()
        assert False  # This should not happen
    except StackUnderflowError:
        assert True  # This should happen

    try:
        _ = stack.peek()
        assert False  # This should not happen
    except StackUnderflowError:
        assert True  # This should happen

    for i in range(10):
        assert stack.size() == i
        stack.push(i)

    assert bool(stack)
    assert not stack.is_empty()
    assert stack.is_full()
    assert str(stack) == str(list(range(10)))
    assert stack.pop() == 9
    assert stack.peek() == 8

    stack.push(100)
    assert str(stack) == str([0, 1, 2, 3, 4, 5, 6, 7, 8, 100])

    try:
        stack.push(200)
        assert False  # This should not happen
    except StackOverflowError:
        assert True  # This should happen

    assert not stack.is_empty()
    assert stack.size() == 10

    assert 5 in stack
    assert 55 not in stack


if __name__ == "__main__":
    test_stack()
from __future__ import annotations

from typing import Generic, TypeVar

T = TypeVar("T")


class StackOverflowError(BaseException):
    pass


class StackUnderflowError(BaseException):
    pass


class Stack(Generic[T]):
    """A stack is an abstract data type that serves as a collection of
    elements with two principal operations: push() and pop(). push() adds an
    element to the top of the stack, and pop() removes an element from the
    top of a stack. The order in which elements come off of a stack are
    Last In, First Out (LIFO).
    https://en.wikipedia.org/wiki/Stack_(abstract_data_type)
    """

    def __init__(self, limit: int = 10):
        self.stack: list[T] = []
        self.limit = limit

    def __bool__(self) -> bool:
        return bool(self.stack)

    def __str__(self) -> str:
        return str(self.stack)

    def push(self, data: T) -> None:
        """Push an element to the top of the stack."""
        if len(self.stack) >= self.limit:
            raise StackOverflowError
        self.stack.append(data)

    def pop(self) -> T:
        """
        Pop an element off of the top of the stack.

        >>> Stack().pop()
        Traceback (most recent call last):
            ...
        data_structures.stacks.stack.StackUnderflowError
        """
        if not self.stack:
            raise StackUnderflowError
        return self.stack.pop()

    def peek(self) -> T:
        """
        Peek at the top-most element of the stack.

        >>> Stack().pop()
        Traceback (most recent call last):
            ...
        data_structures.stacks.stack.StackUnderflowError
        """
        if not self.stack:
            raise StackUnderflowError
        return self.stack[-1]

    def is_empty(self) -> bool:
        """Check if a stack is empty."""
        return not bool(self.stack)

    def is_full(self) -> bool:
        return self.size() == self.limit

    def size(self) -> int:
        """Return the size of the stack."""
        return len(self.stack)

    def __contains__(self, item: T) -> bool:
        """Check if item is in stack"""
        return item in self.stack


def test_stack() -> None:
    """
    >>> test_stack()
    """
    stack: Stack[int] = Stack(10)
    assert bool(stack) is False
    assert stack.is_empty() is True
    assert stack.is_full() is False
    assert str(stack) == "[]"

    try:
        _ = stack.pop()
        assert False  # This should not happen
    except StackUnderflowError:
        assert True  # This should happen

    try:
        _ = stack.peek()
        assert False  # This should not happen
    except StackUnderflowError:
        assert True  # This should happen

    for i in range(10):
        assert stack.size() == i
        stack.push(i)

    assert bool(stack)
    assert not stack.is_empty()
    assert stack.is_full()
    assert str(stack) == str(list(range(10)))
    assert stack.pop() == 9
    assert stack.peek() == 8

    stack.push(100)
    assert str(stack) == str([0, 1, 2, 3, 4, 5, 6, 7, 8, 100])

    try:
        stack.push(200)
        assert False  # This should not happen
    except StackOverflowError:
        assert True  # This should happen

    assert not stack.is_empty()
    assert stack.size() == 10

    assert 5 in stack
    assert 55 not in stack


if __name__ == "__main__":
    test_stack()
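Editor's note: a small usage example (not part of the original after_content) showing what the generic annotations buy at call sites; under `mypy --strict`, the element type is fixed by the variable annotation and mismatched pushes are rejected. It assumes the file above is importable as `data_structures.stacks.stack`, the module path used in its own doctests.

```
# Illustrative only: demonstrates type checking of the generic Stack.
from data_structures.stacks.stack import Stack

names: Stack[str] = Stack(limit=3)  # element type fixed to str
names.push("alpha")                 # accepted: "alpha" is a str
top: str = names.peek()             # mypy knows peek() returns str here
# names.push(42)                    # mypy --strict would reject: int is not str
```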
1
TheAlgorithms/Python
5,566
[mypy] Fix type annotations for stack.py
```
$ git checkout mypy-fix-stacks-stack
Switched to branch 'mypy-fix-stacks-stack'
$ mypy --ignore-missing-imports data_structures/stacks/stack.py --strict
Success: no issues found in 1 source file
$ mypy --ignore-missing-imports data_structures/stacks --strict > after.txt
$ git checkout master
Switched to branch 'master'
$ mypy --ignore-missing-imports data_structures/stacks --strict > before.txt
$ diff before.txt after.txt
39,49d38
< data_structures/stacks/stack.py:31: error: Function is missing a type annotation
< data_structures/stacks/stack.py:37: error: Function is missing a return type annotation
< data_structures/stacks/stack.py:50: error: Function is missing a return type annotation
< data_structures/stacks/stack.py:74: error: Function is missing a type annotation for one or more arguments
< data_structures/stacks/stack.py:90: error: Call to untyped function "pop" in typed context
< data_structures/stacks/stack.py:96: error: Call to untyped function "peek" in typed context
< data_structures/stacks/stack.py:103: error: Call to untyped function "push" in typed context
< data_structures/stacks/stack.py:109: error: Call to untyped function "pop" in typed context
< data_structures/stacks/stack.py:110: error: Call to untyped function "peek" in typed context
< data_structures/stacks/stack.py:112: error: Call to untyped function "push" in typed context
< data_structures/stacks/stack.py:116: error: Call to untyped function "push" in typed context
52,62d40
< data_structures/stacks/dijkstras_two_stack_algorithm.py:60: error: Call to untyped function "push" in typed context
< data_structures/stacks/dijkstras_two_stack_algorithm.py:63: error: Call to untyped function "push" in typed context
< data_structures/stacks/dijkstras_two_stack_algorithm.py:66: error: Call to untyped function "peek" in typed context
< data_structures/stacks/dijkstras_two_stack_algorithm.py:67: error: Call to untyped function "pop" in typed context
< data_structures/stacks/dijkstras_two_stack_algorithm.py:68: error: Call to untyped function "peek" in typed context
< data_structures/stacks/dijkstras_two_stack_algorithm.py:69: error: Call to untyped function "pop" in typed context
< data_structures/stacks/dijkstras_two_stack_algorithm.py:70: error: Call to untyped function "peek" in typed context
< data_structures/stacks/dijkstras_two_stack_algorithm.py:71: error: Call to untyped function "pop" in typed context
< data_structures/stacks/dijkstras_two_stack_algorithm.py:74: error: Call to untyped function "push" in typed context
< data_structures/stacks/dijkstras_two_stack_algorithm.py:77: error: Returning Any from function declared to return "int"
< data_structures/stacks/dijkstras_two_stack_algorithm.py:77: error: Call to untyped function "peek" in typed context
75,85c53
< data_structures/stacks/balanced_parentheses.py:21: error: Call to untyped function "push" in typed context
< data_structures/stacks/balanced_parentheses.py:23: error: Call to untyped function "pop" in typed context
< data_structures/stacks/infix_to_postfix_conversion.py:47: error: Call to untyped function "push" in typed context
< data_structures/stacks/infix_to_postfix_conversion.py:49: error: Call to untyped function "peek" in typed context
< data_structures/stacks/infix_to_postfix_conversion.py:50: error: Call to untyped function "pop" in typed context
< data_structures/stacks/infix_to_postfix_conversion.py:51: error: Call to untyped function "pop" in typed context
< data_structures/stacks/infix_to_postfix_conversion.py:53: error: Call to untyped function "peek" in typed context
< data_structures/stacks/infix_to_postfix_conversion.py:54: error: Call to untyped function "pop" in typed context
< data_structures/stacks/infix_to_postfix_conversion.py:55: error: Call to untyped function "push" in typed context
< data_structures/stacks/infix_to_postfix_conversion.py:57: error: Call to untyped function "pop" in typed context
< Found 82 errors in 12 files (checked 13 source files)
---
> Found 50 errors in 8 files (checked 13 source files)
```

Related to #4052

### **Describe your change:**

Add generic type annotations to functions in `stack.py`. Add type annotations to variables of `Stack` class.

* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?

### **Checklist:**

* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
archaengel
"2021-10-23T18:21:43Z"
"2021-10-26T18:33:08Z"
582f57f41fb9d36ae8fe4d49c98775877b9013b7
c0ed031b3fcf47736f98dfd89e2588dbffceadde
[mypy] Fix type annotations for stack.py.

```
$ git checkout mypy-fix-stacks-stack
Switched to branch 'mypy-fix-stacks-stack'
$ mypy --ignore-missing-imports data_structures/stacks/stack.py --strict
Success: no issues found in 1 source file
$ mypy --ignore-missing-imports data_structures/stacks --strict > after.txt
$ git checkout master
Switched to branch 'master'
$ mypy --ignore-missing-imports data_structures/stacks --strict > before.txt
$ diff before.txt after.txt
39,49d38
< data_structures/stacks/stack.py:31: error: Function is missing a type annotation
< data_structures/stacks/stack.py:37: error: Function is missing a return type annotation
< data_structures/stacks/stack.py:50: error: Function is missing a return type annotation
< data_structures/stacks/stack.py:74: error: Function is missing a type annotation for one or more arguments
< data_structures/stacks/stack.py:90: error: Call to untyped function "pop" in typed context
< data_structures/stacks/stack.py:96: error: Call to untyped function "peek" in typed context
< data_structures/stacks/stack.py:103: error: Call to untyped function "push" in typed context
< data_structures/stacks/stack.py:109: error: Call to untyped function "pop" in typed context
< data_structures/stacks/stack.py:110: error: Call to untyped function "peek" in typed context
< data_structures/stacks/stack.py:112: error: Call to untyped function "push" in typed context
< data_structures/stacks/stack.py:116: error: Call to untyped function "push" in typed context
52,62d40
< data_structures/stacks/dijkstras_two_stack_algorithm.py:60: error: Call to untyped function "push" in typed context
< data_structures/stacks/dijkstras_two_stack_algorithm.py:63: error: Call to untyped function "push" in typed context
< data_structures/stacks/dijkstras_two_stack_algorithm.py:66: error: Call to untyped function "peek" in typed context
< data_structures/stacks/dijkstras_two_stack_algorithm.py:67: error: Call to untyped function "pop" in typed context
< data_structures/stacks/dijkstras_two_stack_algorithm.py:68: error: Call to untyped function "peek" in typed context
< data_structures/stacks/dijkstras_two_stack_algorithm.py:69: error: Call to untyped function "pop" in typed context
< data_structures/stacks/dijkstras_two_stack_algorithm.py:70: error: Call to untyped function "peek" in typed context
< data_structures/stacks/dijkstras_two_stack_algorithm.py:71: error: Call to untyped function "pop" in typed context
< data_structures/stacks/dijkstras_two_stack_algorithm.py:74: error: Call to untyped function "push" in typed context
< data_structures/stacks/dijkstras_two_stack_algorithm.py:77: error: Returning Any from function declared to return "int"
< data_structures/stacks/dijkstras_two_stack_algorithm.py:77: error: Call to untyped function "peek" in typed context
75,85c53
< data_structures/stacks/balanced_parentheses.py:21: error: Call to untyped function "push" in typed context
< data_structures/stacks/balanced_parentheses.py:23: error: Call to untyped function "pop" in typed context
< data_structures/stacks/infix_to_postfix_conversion.py:47: error: Call to untyped function "push" in typed context
< data_structures/stacks/infix_to_postfix_conversion.py:49: error: Call to untyped function "peek" in typed context
< data_structures/stacks/infix_to_postfix_conversion.py:50: error: Call to untyped function "pop" in typed context
< data_structures/stacks/infix_to_postfix_conversion.py:51: error: Call to untyped function "pop" in typed context
< data_structures/stacks/infix_to_postfix_conversion.py:53: error: Call to untyped function "peek" in typed context
< data_structures/stacks/infix_to_postfix_conversion.py:54: error: Call to untyped function "pop" in typed context
< data_structures/stacks/infix_to_postfix_conversion.py:55: error: Call to untyped function "push" in typed context
< data_structures/stacks/infix_to_postfix_conversion.py:57: error: Call to untyped function "pop" in typed context
< Found 82 errors in 12 files (checked 13 source files)
---
> Found 50 errors in 8 files (checked 13 source files)
```

Related to #4052

### **Describe your change:**

Add generic type annotations to functions in `stack.py`. Add type annotations to variables of `Stack` class.

* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?

### **Checklist:**

* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [x] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
-1