repo_name (stringclasses, 1 value) | pr_number (int64, 4.12k to 11.2k) | pr_title (stringlengths, 9 to 107) | pr_description (stringlengths, 107 to 5.48k) | author (stringlengths, 4 to 18) | date_created (unknown) | date_merged (unknown) | previous_commit (stringlengths, 40) | pr_commit (stringlengths, 40) | query (stringlengths, 118 to 5.52k) | before_content (stringlengths, 0 to 7.93M) | after_content (stringlengths, 0 to 7.93M) | label (int64, -1 to 1)
---|---|---|---|---|---|---|---|---|---|---|---|---|
TheAlgorithms/Python | 6,258 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler` | ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | ngiachou | "2022-07-20T00:10:06Z" | "2022-09-14T08:40:04Z" | 81e30fd33c91bc37bc3baf54c42d1b192ecf41a6 | 2104fa7aebe8d76b2b2b2c47fe7e2ee615a05df6 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler`. ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | def is_contains_unique_chars(input_str: str) -> bool:
"""
Check if all characters in the string are unique.
>>> is_contains_unique_chars("I_love.py")
True
>>> is_contains_unique_chars("I don't love Python")
False
Time complexity: O(n)
Space complexity: O(1) (19320 bytes, since Unicode defines 144697 characters)
"""
# Each bit of the bitmap represents one Unicode character
# For example, bit 65 represents 'A'
# https://stackoverflow.com/a/12811293
bitmap = 0
for ch in input_str:
ch_unicode = ord(ch)
ch_bit_index_on = pow(2, ch_unicode)
# If we already turned on bit for current character's unicode
if bitmap >> ch_unicode & 1 == 1:
return False
bitmap |= ch_bit_index_on
return True
if __name__ == "__main__":
import doctest
doctest.testmod()
| def is_contains_unique_chars(input_str: str) -> bool:
"""
Check if all characters in the string are unique.
>>> is_contains_unique_chars("I_love.py")
True
>>> is_contains_unique_chars("I don't love Python")
False
Time complexity: O(n)
Space complexity: O(1) (19320 bytes, since Unicode defines 144697 characters)
"""
# Each bit of the bitmap represents one Unicode character
# For example, bit 65 represents 'A'
# https://stackoverflow.com/a/12811293
bitmap = 0
for ch in input_str:
ch_unicode = ord(ch)
ch_bit_index_on = pow(2, ch_unicode)
# If we already turned on bit for current character's unicode
if bitmap >> ch_unicode & 1 == 1:
return False
bitmap |= ch_bit_index_on
return True
if __name__ == "__main__":
import doctest
doctest.testmod()
| -1 |
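The pull request described in this row unifies primality checks under project_euler behind a single O(sqrt(n)) test. A minimal sketch of such a trial-division is_prime follows; the helper's exact name, signature, and location in the repository are assumptions for illustration only.
def is_prime(number: int) -> bool:
    """Trial division up to sqrt(number); sketch of the unified helper the PR describes.

    >>> [n for n in range(20) if is_prime(n)]
    [2, 3, 5, 7, 11, 13, 17, 19]
    """
    if number < 2:
        return False
    if number < 4:  # 2 and 3 are prime
        return True
    if number % 2 == 0 or number % 3 == 0:
        return False
    # every prime greater than 3 has the form 6k - 1 or 6k + 1, so test candidates in pairs
    candidate = 5
    while candidate * candidate <= number:
        if number % candidate == 0 or number % (candidate + 2) == 0:
            return False
        candidate += 6
    return True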
TheAlgorithms/Python | 6,258 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler` | ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | ngiachou | "2022-07-20T00:10:06Z" | "2022-09-14T08:40:04Z" | 81e30fd33c91bc37bc3baf54c42d1b192ecf41a6 | 2104fa7aebe8d76b2b2b2c47fe7e2ee615a05df6 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler`. ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | """
The A* algorithm combines features of uniform-cost search and pure
heuristic search to efficiently compute optimal solutions.
A* is a best-first search algorithm in which the cost
associated with a node is f(n) = g(n) + h(n),
where g(n) is the cost of the path from the initial state to node n and
h(n) is the heuristic estimate of the cost of a path
from node n to a goal. A* introduces a heuristic into a
regular graph-searching algorithm, essentially planning ahead
at each step so that a more optimal decision is made.
A* is also known as the algorithm with brains.
"""
import numpy as np
class Cell:
"""
Class Cell represents a cell in the world, which has the properties:
position : The position of the cell, represented by a tuple of x and y
coordinates, initially set to (0, 0)
parent : The parent Cell object that was visited before arriving at this cell
g, h, f : The parameters for constructing the heuristic function,
which can be any function; for simplicity, the straight-line
distance is used
"""
def __init__(self):
self.position = (0, 0)
self.parent = None
self.g = 0
self.h = 0
self.f = 0
"""
Overrides the equality method because otherwise cell comparisons would
give wrong results
"""
def __eq__(self, cell):
return self.position == cell.position
def showcell(self):
print(self.position)
class Gridworld:
"""
Gridworld class represents the external world, here an M*M grid
matrix
world_size: creates a numpy array with the given world_size; the default is (5, 5)
"""
def __init__(self, world_size=(5, 5)):
self.w = np.zeros(world_size)
self.world_x_limit = world_size[0]
self.world_y_limit = world_size[1]
def show(self):
print(self.w)
def get_neigbours(self, cell):
"""
Return the neighbours of cell
"""
neughbour_cord = [
(-1, -1),
(-1, 0),
(-1, 1),
(0, -1),
(0, 1),
(1, -1),
(1, 0),
(1, 1),
]
current_x = cell.position[0]
current_y = cell.position[1]
neighbours = []
for n in neughbour_cord:
x = current_x + n[0]
y = current_y + n[1]
if 0 <= x < self.world_x_limit and 0 <= y < self.world_y_limit:
c = Cell()
c.position = (x, y)
c.parent = cell
neighbours.append(c)
return neighbours
def astar(world, start, goal):
"""
Implementation of the A* algorithm
world : the Gridworld object to search in
start : Cell object for the start position
goal : Cell object for the goal position
>>> p = Gridworld()
>>> start = Cell()
>>> start.position = (0,0)
>>> goal = Cell()
>>> goal.position = (4,4)
>>> astar(p, start, goal)
[(0, 0), (1, 1), (2, 2), (3, 3), (4, 4)]
"""
_open = []
_closed = []
_open.append(start)
while _open:
min_f = np.argmin([n.f for n in _open])
current = _open[min_f]
_closed.append(_open.pop(min_f))
if current == goal:
break
for n in world.get_neigbours(current):
for c in _closed:
if c == n:
continue
n.g = current.g + 1
x1, y1 = n.position
x2, y2 = goal.position
n.h = (y2 - y1) ** 2 + (x2 - x1) ** 2
n.f = n.h + n.g
for c in _open:
if c == n and c.f < n.f:
continue
_open.append(n)
path = []
while current.parent is not None:
path.append(current.position)
current = current.parent
path.append(current.position)
return path[::-1]
if __name__ == "__main__":
world = Gridworld()
# start position and goal
start = Cell()
start.position = (0, 0)
goal = Cell()
goal.position = (4, 4)
print(f"path from {start.position} to {goal.position}")
s = astar(world, start, goal)
# Just for visual reasons
for i in s:
world.w[i] = 1
print(world.w)
| """
The A* algorithm combines features of uniform-cost search and pure
heuristic search to efficiently compute optimal solutions.
A* is a best-first search algorithm in which the cost
associated with a node is f(n) = g(n) + h(n),
where g(n) is the cost of the path from the initial state to node n and
h(n) is the heuristic estimate of the cost of a path
from node n to a goal. A* introduces a heuristic into a
regular graph-searching algorithm, essentially planning ahead
at each step so that a more optimal decision is made.
A* is also known as the algorithm with brains.
"""
import numpy as np
class Cell:
"""
Class Cell represents a cell in the world, which has the properties:
position : The position of the cell, represented by a tuple of x and y
coordinates, initially set to (0, 0)
parent : The parent Cell object that was visited before arriving at this cell
g, h, f : The parameters for constructing the heuristic function,
which can be any function; for simplicity, the straight-line
distance is used
"""
def __init__(self):
self.position = (0, 0)
self.parent = None
self.g = 0
self.h = 0
self.f = 0
"""
Overrides the equality method because otherwise cell comparisons would
give wrong results
"""
def __eq__(self, cell):
return self.position == cell.position
def showcell(self):
print(self.position)
class Gridworld:
"""
Gridworld class represents the external world, here an M*M grid
matrix
world_size: creates a numpy array with the given world_size; the default is (5, 5)
"""
def __init__(self, world_size=(5, 5)):
self.w = np.zeros(world_size)
self.world_x_limit = world_size[0]
self.world_y_limit = world_size[1]
def show(self):
print(self.w)
def get_neigbours(self, cell):
"""
Return the neighbours of cell
"""
neughbour_cord = [
(-1, -1),
(-1, 0),
(-1, 1),
(0, -1),
(0, 1),
(1, -1),
(1, 0),
(1, 1),
]
current_x = cell.position[0]
current_y = cell.position[1]
neighbours = []
for n in neughbour_cord:
x = current_x + n[0]
y = current_y + n[1]
if 0 <= x < self.world_x_limit and 0 <= y < self.world_y_limit:
c = Cell()
c.position = (x, y)
c.parent = cell
neighbours.append(c)
return neighbours
def astar(world, start, goal):
"""
Implementation of the A* algorithm
world : the Gridworld object to search in
start : Cell object for the start position
goal : Cell object for the goal position
>>> p = Gridworld()
>>> start = Cell()
>>> start.position = (0,0)
>>> goal = Cell()
>>> goal.position = (4,4)
>>> astar(p, start, goal)
[(0, 0), (1, 1), (2, 2), (3, 3), (4, 4)]
"""
_open = []
_closed = []
_open.append(start)
while _open:
min_f = np.argmin([n.f for n in _open])
current = _open[min_f]
_closed.append(_open.pop(min_f))
if current == goal:
break
for n in world.get_neigbours(current):
for c in _closed:
if c == n:
continue
n.g = current.g + 1
x1, y1 = n.position
x2, y2 = goal.position
n.h = (y2 - y1) ** 2 + (x2 - x1) ** 2
n.f = n.h + n.g
for c in _open:
if c == n and c.f < n.f:
continue
_open.append(n)
path = []
while current.parent is not None:
path.append(current.position)
current = current.parent
path.append(current.position)
return path[::-1]
if __name__ == "__main__":
world = Gridworld()
# start position and goal
start = Cell()
start.position = (0, 0)
goal = Cell()
goal.position = (4, 4)
print(f"path from {start.position} to {goal.position}")
s = astar(world, start, goal)
# Just for visual reasons
for i in s:
world.w[i] = 1
print(world.w)
| -1 |
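The gridworld A* in this row uses the squared Euclidean distance as h(n), which can overestimate the true remaining cost on a grid where every move costs 1. One admissible alternative for an 8-connected grid, sketched here as an illustration rather than as the repository's code, is the Chebyshev distance:
def chebyshev_heuristic(position: tuple[int, int], goal: tuple[int, int]) -> int:
    """Minimum number of unit-cost 8-connected moves between two cells."""
    # the larger of the two axis distances never exceeds the real move count,
    # so this heuristic is admissible for unit step costs
    (x1, y1), (x2, y2) = position, goal
    return max(abs(x2 - x1), abs(y2 - y1))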
TheAlgorithms/Python | 6,258 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler` | ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | ngiachou | "2022-07-20T00:10:06Z" | "2022-09-14T08:40:04Z" | 81e30fd33c91bc37bc3baf54c42d1b192ecf41a6 | 2104fa7aebe8d76b2b2b2c47fe7e2ee615a05df6 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler`. ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | from __future__ import annotations
import math
class SegmentTree:
def __init__(self, size: int) -> None:
self.size = size
# approximate the overall size of segment tree with given value
self.segment_tree = [0 for i in range(0, 4 * size)]
# create array to store lazy update
self.lazy = [0 for i in range(0, 4 * size)]
self.flag = [0 for i in range(0, 4 * size)] # flag for lazy update
def left(self, idx: int) -> int:
"""
>>> segment_tree = SegmentTree(15)
>>> segment_tree.left(1)
2
>>> segment_tree.left(2)
4
>>> segment_tree.left(12)
24
"""
return idx * 2
def right(self, idx: int) -> int:
"""
>>> segment_tree = SegmentTree(15)
>>> segment_tree.right(1)
3
>>> segment_tree.right(2)
5
>>> segment_tree.right(12)
25
"""
return idx * 2 + 1
def build(
self, idx: int, left_element: int, right_element: int, A: list[int]
) -> None:
if left_element == right_element:
self.segment_tree[idx] = A[left_element - 1]
else:
mid = (left_element + right_element) // 2
self.build(self.left(idx), left_element, mid, A)
self.build(self.right(idx), mid + 1, right_element, A)
self.segment_tree[idx] = max(
self.segment_tree[self.left(idx)], self.segment_tree[self.right(idx)]
)
def update(
self, idx: int, left_element: int, right_element: int, a: int, b: int, val: int
) -> bool:
"""
update in O(lg n) (a normal segment tree without lazy update would take
O(n lg n) for each range update)
update(1, 1, size, a, b, v) assigns value v to the range [a, b]
"""
if self.flag[idx] is True:
self.segment_tree[idx] = self.lazy[idx]
self.flag[idx] = False
if left_element != right_element:
self.lazy[self.left(idx)] = self.lazy[idx]
self.lazy[self.right(idx)] = self.lazy[idx]
self.flag[self.left(idx)] = True
self.flag[self.right(idx)] = True
if right_element < a or left_element > b:
return True
if left_element >= a and right_element <= b:
self.segment_tree[idx] = val
if left_element != right_element:
self.lazy[self.left(idx)] = val
self.lazy[self.right(idx)] = val
self.flag[self.left(idx)] = True
self.flag[self.right(idx)] = True
return True
mid = (left_element + right_element) // 2
self.update(self.left(idx), left_element, mid, a, b, val)
self.update(self.right(idx), mid + 1, right_element, a, b, val)
self.segment_tree[idx] = max(
self.segment_tree[self.left(idx)], self.segment_tree[self.right(idx)]
)
return True
# query with O(lg n)
def query(
self, idx: int, left_element: int, right_element: int, a: int, b: int
) -> int | float:
"""
query(1, 1, size, a, b) returns the max of the range [a, b]
>>> A = [1, 2, -4, 7, 3, -5, 6, 11, -20, 9, 14, 15, 5, 2, -8]
>>> segment_tree = SegmentTree(15)
>>> segment_tree.build(1, 1, 15, A)
>>> segment_tree.query(1, 1, 15, 4, 6)
7
>>> segment_tree.query(1, 1, 15, 7, 11)
14
>>> segment_tree.query(1, 1, 15, 7, 12)
15
"""
if self.flag[idx] is True:
self.segment_tree[idx] = self.lazy[idx]
self.flag[idx] = False
if left_element != right_element:
self.lazy[self.left(idx)] = self.lazy[idx]
self.lazy[self.right(idx)] = self.lazy[idx]
self.flag[self.left(idx)] = True
self.flag[self.right(idx)] = True
if right_element < a or left_element > b:
return -math.inf
if left_element >= a and right_element <= b:
return self.segment_tree[idx]
mid = (left_element + right_element) // 2
q1 = self.query(self.left(idx), left_element, mid, a, b)
q2 = self.query(self.right(idx), mid + 1, right_element, a, b)
return max(q1, q2)
def __str__(self) -> str:
return str([self.query(1, 1, self.size, i, i) for i in range(1, self.size + 1)])
if __name__ == "__main__":
A = [1, 2, -4, 7, 3, -5, 6, 11, -20, 9, 14, 15, 5, 2, -8]
size = 15
segt = SegmentTree(size)
segt.build(1, 1, size, A)
print(segt.query(1, 1, size, 4, 6))
print(segt.query(1, 1, size, 7, 11))
print(segt.query(1, 1, size, 7, 12))
segt.update(1, 1, size, 1, 3, 111)
print(segt.query(1, 1, size, 1, 15))
segt.update(1, 1, size, 7, 8, 235)
print(segt)
| from __future__ import annotations
import math
class SegmentTree:
def __init__(self, size: int) -> None:
self.size = size
# approximate the overall size of segment tree with given value
self.segment_tree = [0 for i in range(0, 4 * size)]
# create array to store lazy update
self.lazy = [0 for i in range(0, 4 * size)]
self.flag = [0 for i in range(0, 4 * size)] # flag for lazy update
def left(self, idx: int) -> int:
"""
>>> segment_tree = SegmentTree(15)
>>> segment_tree.left(1)
2
>>> segment_tree.left(2)
4
>>> segment_tree.left(12)
24
"""
return idx * 2
def right(self, idx: int) -> int:
"""
>>> segment_tree = SegmentTree(15)
>>> segment_tree.right(1)
3
>>> segment_tree.right(2)
5
>>> segment_tree.right(12)
25
"""
return idx * 2 + 1
def build(
self, idx: int, left_element: int, right_element: int, A: list[int]
) -> None:
if left_element == right_element:
self.segment_tree[idx] = A[left_element - 1]
else:
mid = (left_element + right_element) // 2
self.build(self.left(idx), left_element, mid, A)
self.build(self.right(idx), mid + 1, right_element, A)
self.segment_tree[idx] = max(
self.segment_tree[self.left(idx)], self.segment_tree[self.right(idx)]
)
def update(
self, idx: int, left_element: int, right_element: int, a: int, b: int, val: int
) -> bool:
"""
update in O(lg n) (a normal segment tree without lazy update would take
O(n lg n) for each range update)
update(1, 1, size, a, b, v) assigns value v to the range [a, b]
"""
if self.flag[idx] is True:
self.segment_tree[idx] = self.lazy[idx]
self.flag[idx] = False
if left_element != right_element:
self.lazy[self.left(idx)] = self.lazy[idx]
self.lazy[self.right(idx)] = self.lazy[idx]
self.flag[self.left(idx)] = True
self.flag[self.right(idx)] = True
if right_element < a or left_element > b:
return True
if left_element >= a and right_element <= b:
self.segment_tree[idx] = val
if left_element != right_element:
self.lazy[self.left(idx)] = val
self.lazy[self.right(idx)] = val
self.flag[self.left(idx)] = True
self.flag[self.right(idx)] = True
return True
mid = (left_element + right_element) // 2
self.update(self.left(idx), left_element, mid, a, b, val)
self.update(self.right(idx), mid + 1, right_element, a, b, val)
self.segment_tree[idx] = max(
self.segment_tree[self.left(idx)], self.segment_tree[self.right(idx)]
)
return True
# query with O(lg n)
def query(
self, idx: int, left_element: int, right_element: int, a: int, b: int
) -> int | float:
"""
query(1, 1, size, a, b) returns the max of the range [a, b]
>>> A = [1, 2, -4, 7, 3, -5, 6, 11, -20, 9, 14, 15, 5, 2, -8]
>>> segment_tree = SegmentTree(15)
>>> segment_tree.build(1, 1, 15, A)
>>> segment_tree.query(1, 1, 15, 4, 6)
7
>>> segment_tree.query(1, 1, 15, 7, 11)
14
>>> segment_tree.query(1, 1, 15, 7, 12)
15
"""
if self.flag[idx] is True:
self.segment_tree[idx] = self.lazy[idx]
self.flag[idx] = False
if left_element != right_element:
self.lazy[self.left(idx)] = self.lazy[idx]
self.lazy[self.right(idx)] = self.lazy[idx]
self.flag[self.left(idx)] = True
self.flag[self.right(idx)] = True
if right_element < a or left_element > b:
return -math.inf
if left_element >= a and right_element <= b:
return self.segment_tree[idx]
mid = (left_element + right_element) // 2
q1 = self.query(self.left(idx), left_element, mid, a, b)
q2 = self.query(self.right(idx), mid + 1, right_element, a, b)
return max(q1, q2)
def __str__(self) -> str:
return str([self.query(1, 1, self.size, i, i) for i in range(1, self.size + 1)])
if __name__ == "__main__":
A = [1, 2, -4, 7, 3, -5, 6, 11, -20, 9, 14, 15, 5, 2, -8]
size = 15
segt = SegmentTree(size)
segt.build(1, 1, size, A)
print(segt.query(1, 1, size, 4, 6))
print(segt.query(1, 1, size, 7, 11))
print(segt.query(1, 1, size, 7, 12))
segt.update(1, 1, size, 1, 3, 111)
print(segt.query(1, 1, size, 1, 15))
segt.update(1, 1, size, 7, 8, 235)
print(segt)
| -1 |
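As a usage sketch for the lazy segment tree in this row (reusing its SegmentTree class and the 1-based indexing from its doctests), a range assignment followed by a max query might look like the following; the variable names are illustrative only.
values = [1, 2, -4, 7, 3, -5, 6, 11, -20, 9, 14, 15, 5, 2, -8]
tree = SegmentTree(len(values))
tree.build(1, 1, len(values), values)
tree.update(1, 1, len(values), 7, 8, 235)    # lazily assign 235 to positions 7..8
print(tree.query(1, 1, len(values), 1, 15))  # 235, now the maximum of the whole range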
TheAlgorithms/Python | 6,258 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler` | ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | ngiachou | "2022-07-20T00:10:06Z" | "2022-09-14T08:40:04Z" | 81e30fd33c91bc37bc3baf54c42d1b192ecf41a6 | 2104fa7aebe8d76b2b2b2c47fe7e2ee615a05df6 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler`. ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | """
Get the citation count from Google Scholar
using the title and year of publication, and the volume and pages of the journal.
"""
import requests
from bs4 import BeautifulSoup
def get_citation(base_url: str, params: dict) -> str:
"""
Return the citation count.
"""
soup = BeautifulSoup(requests.get(base_url, params=params).content, "html.parser")
div = soup.find("div", attrs={"class": "gs_ri"})
anchors = div.find("div", attrs={"class": "gs_fl"}).find_all("a")
return anchors[2].get_text()
if __name__ == "__main__":
params = {
"title": (
"Precisely geometry controlled microsupercapacitors for ultrahigh areal "
"capacitance, volumetric capacitance, and energy density"
),
"journal": "Chem. Mater.",
"volume": 30,
"pages": "3979-3990",
"year": 2018,
"hl": "en",
}
print(get_citation("http://scholar.google.com/scholar_lookup", params=params))
| """
Get the citation count from Google Scholar
using the title and year of publication, and the volume and pages of the journal.
"""
import requests
from bs4 import BeautifulSoup
def get_citation(base_url: str, params: dict) -> str:
"""
Return the citation count.
"""
soup = BeautifulSoup(requests.get(base_url, params=params).content, "html.parser")
div = soup.find("div", attrs={"class": "gs_ri"})
anchors = div.find("div", attrs={"class": "gs_fl"}).find_all("a")
return anchors[2].get_text()
if __name__ == "__main__":
params = {
"title": (
"Precisely geometry controlled microsupercapacitors for ultrahigh areal "
"capacitance, volumetric capacitance, and energy density"
),
"journal": "Chem. Mater.",
"volume": 30,
"pages": "3979-3990",
"year": 2018,
"hl": "en",
}
print(get_citation("http://scholar.google.com/scholar_lookup", params=params))
| -1 |
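The scraper in this row calls requests.get without a timeout, so a stalled response from Google Scholar would block indefinitely. requests accepts a timeout argument, and a hedged variant of the fetch (same parsing, just bounded waiting) could read:
soup = BeautifulSoup(
    requests.get(base_url, params=params, timeout=10).content,  # give up after 10 seconds
    "html.parser",
)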
TheAlgorithms/Python | 6,258 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler` | ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | ngiachou | "2022-07-20T00:10:06Z" | "2022-09-14T08:40:04Z" | 81e30fd33c91bc37bc3baf54c42d1b192ecf41a6 | 2104fa7aebe8d76b2b2b2c47fe7e2ee615a05df6 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler`. ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | def bin_to_decimal(bin_string: str) -> int:
"""
Convert a binary value to its decimal equivalent
>>> bin_to_decimal("101")
5
>>> bin_to_decimal(" 1010 ")
10
>>> bin_to_decimal("-11101")
-29
>>> bin_to_decimal("0")
0
>>> bin_to_decimal("a")
Traceback (most recent call last):
...
ValueError: Non-binary value was passed to the function
>>> bin_to_decimal("")
Traceback (most recent call last):
...
ValueError: Empty string was passed to the function
>>> bin_to_decimal("39")
Traceback (most recent call last):
...
ValueError: Non-binary value was passed to the function
"""
bin_string = str(bin_string).strip()
if not bin_string:
raise ValueError("Empty string was passed to the function")
is_negative = bin_string[0] == "-"
if is_negative:
bin_string = bin_string[1:]
if not all(char in "01" for char in bin_string):
raise ValueError("Non-binary value was passed to the function")
decimal_number = 0
for char in bin_string:
decimal_number = 2 * decimal_number + int(char)
return -decimal_number if is_negative else decimal_number
if __name__ == "__main__":
from doctest import testmod
testmod()
| def bin_to_decimal(bin_string: str) -> int:
"""
Convert a binary value to its decimal equivalent
>>> bin_to_decimal("101")
5
>>> bin_to_decimal(" 1010 ")
10
>>> bin_to_decimal("-11101")
-29
>>> bin_to_decimal("0")
0
>>> bin_to_decimal("a")
Traceback (most recent call last):
...
ValueError: Non-binary value was passed to the function
>>> bin_to_decimal("")
Traceback (most recent call last):
...
ValueError: Empty string was passed to the function
>>> bin_to_decimal("39")
Traceback (most recent call last):
...
ValueError: Non-binary value was passed to the function
"""
bin_string = str(bin_string).strip()
if not bin_string:
raise ValueError("Empty string was passed to the function")
is_negative = bin_string[0] == "-"
if is_negative:
bin_string = bin_string[1:]
if not all(char in "01" for char in bin_string):
raise ValueError("Non-binary value was passed to the function")
decimal_number = 0
for char in bin_string:
decimal_number = 2 * decimal_number + int(char)
return -decimal_number if is_negative else decimal_number
if __name__ == "__main__":
from doctest import testmod
testmod()
| -1 |
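For a quick cross-check of the converter in this row, Python's built-in int performs the same base-2 conversion, including the leading sign and the surrounding whitespace that bin_to_decimal strips:
for sample in ("101", " 1010 ", "-11101", "0"):
    # both accept an optional sign and stray whitespace around the digits
    assert bin_to_decimal(sample) == int(sample, 2)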
TheAlgorithms/Python | 6,258 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler` | ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | ngiachou | "2022-07-20T00:10:06Z" | "2022-09-14T08:40:04Z" | 81e30fd33c91bc37bc3baf54c42d1b192ecf41a6 | 2104fa7aebe8d76b2b2b2c47fe7e2ee615a05df6 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler`. ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | def apply_table(inp, table):
"""
>>> apply_table("0123456789", list(range(10)))
'9012345678'
>>> apply_table("0123456789", list(range(9, -1, -1)))
'8765432109'
"""
res = ""
for i in table:
res += inp[i - 1]
return res
def left_shift(data):
"""
>>> left_shift("0123456789")
'1234567890'
"""
return data[1:] + data[0]
def XOR(a, b):
"""
>>> XOR("01010101", "00001111")
'01011010'
"""
res = ""
for i in range(len(a)):
if a[i] == b[i]:
res += "0"
else:
res += "1"
return res
def apply_sbox(s, data):
row = int("0b" + data[0] + data[-1], 2)
col = int("0b" + data[1:3], 2)
return bin(s[row][col])[2:]
def function(expansion, s0, s1, key, message):
left = message[:4]
right = message[4:]
temp = apply_table(right, expansion)
temp = XOR(temp, key)
l = apply_sbox(s0, temp[:4]) # noqa: E741
r = apply_sbox(s1, temp[4:])
l = "0" * (2 - len(l)) + l # noqa: E741
r = "0" * (2 - len(r)) + r
temp = apply_table(l + r, p4_table)
temp = XOR(left, temp)
return temp + right
if __name__ == "__main__":
key = input("Enter 10 bit key: ")
message = input("Enter 8 bit message: ")
p8_table = [6, 3, 7, 4, 8, 5, 10, 9]
p10_table = [3, 5, 2, 7, 4, 10, 1, 9, 8, 6]
p4_table = [2, 4, 3, 1]
IP = [2, 6, 3, 1, 4, 8, 5, 7]
IP_inv = [4, 1, 3, 5, 7, 2, 8, 6]
expansion = [4, 1, 2, 3, 2, 3, 4, 1]
s0 = [[1, 0, 3, 2], [3, 2, 1, 0], [0, 2, 1, 3], [3, 1, 3, 2]]
s1 = [[0, 1, 2, 3], [2, 0, 1, 3], [3, 0, 1, 0], [2, 1, 0, 3]]
# key generation
temp = apply_table(key, p10_table)
left = temp[:5]
right = temp[5:]
left = left_shift(left)
right = left_shift(right)
key1 = apply_table(left + right, p8_table)
left = left_shift(left)
right = left_shift(right)
left = left_shift(left)
right = left_shift(right)
key2 = apply_table(left + right, p8_table)
# encryption
temp = apply_table(message, IP)
temp = function(expansion, s0, s1, key1, temp)
temp = temp[4:] + temp[:4]
temp = function(expansion, s0, s1, key2, temp)
CT = apply_table(temp, IP_inv)
print("Cipher text is:", CT)
# decryption
temp = apply_table(CT, IP)
temp = function(expansion, s0, s1, key2, temp)
temp = temp[4:] + temp[:4]
temp = function(expansion, s0, s1, key1, temp)
PT = apply_table(temp, IP_inv)
print("Plain text after decypting is:", PT)
| def apply_table(inp, table):
"""
>>> apply_table("0123456789", list(range(10)))
'9012345678'
>>> apply_table("0123456789", list(range(9, -1, -1)))
'8765432109'
"""
res = ""
for i in table:
res += inp[i - 1]
return res
def left_shift(data):
"""
>>> left_shift("0123456789")
'1234567890'
"""
return data[1:] + data[0]
def XOR(a, b):
"""
>>> XOR("01010101", "00001111")
'01011010'
"""
res = ""
for i in range(len(a)):
if a[i] == b[i]:
res += "0"
else:
res += "1"
return res
def apply_sbox(s, data):
row = int("0b" + data[0] + data[-1], 2)
col = int("0b" + data[1:3], 2)
return bin(s[row][col])[2:]
def function(expansion, s0, s1, key, message):
left = message[:4]
right = message[4:]
temp = apply_table(right, expansion)
temp = XOR(temp, key)
l = apply_sbox(s0, temp[:4]) # noqa: E741
r = apply_sbox(s1, temp[4:])
l = "0" * (2 - len(l)) + l # noqa: E741
r = "0" * (2 - len(r)) + r
temp = apply_table(l + r, p4_table)
temp = XOR(left, temp)
return temp + right
if __name__ == "__main__":
key = input("Enter 10 bit key: ")
message = input("Enter 8 bit message: ")
p8_table = [6, 3, 7, 4, 8, 5, 10, 9]
p10_table = [3, 5, 2, 7, 4, 10, 1, 9, 8, 6]
p4_table = [2, 4, 3, 1]
IP = [2, 6, 3, 1, 4, 8, 5, 7]
IP_inv = [4, 1, 3, 5, 7, 2, 8, 6]
expansion = [4, 1, 2, 3, 2, 3, 4, 1]
s0 = [[1, 0, 3, 2], [3, 2, 1, 0], [0, 2, 1, 3], [3, 1, 3, 2]]
s1 = [[0, 1, 2, 3], [2, 0, 1, 3], [3, 0, 1, 0], [2, 1, 0, 3]]
# key generation
temp = apply_table(key, p10_table)
left = temp[:5]
right = temp[5:]
left = left_shift(left)
right = left_shift(right)
key1 = apply_table(left + right, p8_table)
left = left_shift(left)
right = left_shift(right)
left = left_shift(left)
right = left_shift(right)
key2 = apply_table(left + right, p8_table)
# encryption
temp = apply_table(message, IP)
temp = function(expansion, s0, s1, key1, temp)
temp = temp[4:] + temp[:4]
temp = function(expansion, s0, s1, key2, temp)
CT = apply_table(temp, IP_inv)
print("Cipher text is:", CT)
# decryption
temp = apply_table(CT, IP)
temp = function(expansion, s0, s1, key2, temp)
temp = temp[4:] + temp[:4]
temp = function(expansion, s0, s1, key1, temp)
PT = apply_table(temp, IP_inv)
print("Plain text after decypting is:", PT)
| -1 |
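The XOR helper in this row compares the two bit strings character by character. An equivalent formulation, shown only as an illustrative alternative rather than the repository's code, converts to integers and formats the result back to the original width:
def xor_bits(a: str, b: str) -> str:
    """XOR two equal-length bit strings, e.g. xor_bits('01010101', '00001111') -> '01011010'."""
    # int(..., 2) parses the bits, ^ combines them, format pads back to len(a) digits
    return format(int(a, 2) ^ int(b, 2), f"0{len(a)}b")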
TheAlgorithms/Python | 6,258 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler` | ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | ngiachou | "2022-07-20T00:10:06Z" | "2022-09-14T08:40:04Z" | 81e30fd33c91bc37bc3baf54c42d1b192ecf41a6 | 2104fa7aebe8d76b2b2b2c47fe7e2ee615a05df6 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler`. ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | from PIL import Image
def change_brightness(img: Image, level: float) -> Image:
"""
Change the brightness of a PIL Image to a given level.
"""
def brightness(c: int) -> float:
"""
Fundamental transformation/operation that will be performed on
every pixel value.
"""
return 128 + level + (c - 128)
if not -255.0 <= level <= 255.0:
raise ValueError("level must be between -255.0 (black) and 255.0 (white)")
return img.point(brightness)
if __name__ == "__main__":
# Load image
with Image.open("image_data/lena.jpg") as img:
# Increase brightness by 100
brigt_img = change_brightness(img, 100)
brigt_img.save("image_data/lena_brightness.png", format="png")
| from PIL import Image
def change_brightness(img: Image, level: float) -> Image:
"""
Change the brightness of a PIL Image to a given level.
"""
def brightness(c: int) -> float:
"""
Fundamental transformation/operation that will be performed on
every pixel value.
"""
return 128 + level + (c - 128)
if not -255.0 <= level <= 255.0:
raise ValueError("level must be between -255.0 (black) and 255.0 (white)")
return img.point(brightness)
if __name__ == "__main__":
# Load image
with Image.open("image_data/lena.jpg") as img:
# Increase brightness by 100
brigt_img = change_brightness(img, 100)
brigt_img.save("image_data/lena_brightness.png", format="png")
| -1 |
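Since brightness(c) in this row simplifies to c + level, results can fall outside the valid 0-255 pixel range. A clamped variant, sketched here as an illustration rather than the repository's implementation, would be:
def change_brightness_clamped(img: Image, level: float) -> Image:
    """Shift every pixel by `level`, clamped to the 8-bit range."""
    return img.point(lambda c: max(0, min(255, c + level)))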
TheAlgorithms/Python | 6,258 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler` | ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | ngiachou | "2022-07-20T00:10:06Z" | "2022-09-14T08:40:04Z" | 81e30fd33c91bc37bc3baf54c42d1b192ecf41a6 | 2104fa7aebe8d76b2b2b2c47fe7e2ee615a05df6 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler`. ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | #!/usr/bin/env python3
"""
Illustrate how to implement the bucket sort algorithm.
Author: OMKAR PATHAK
This program illustrates how to implement the bucket sort algorithm.
Wikipedia says: Bucket sort, or bin sort, is a sorting algorithm that works
by distributing the elements of an array into a number of buckets.
Each bucket is then sorted individually, either using a different sorting
algorithm, or by recursively applying the bucket sorting algorithm. It is a
distribution sort, and is a cousin of radix sort in the most to least
significant digit flavour.
Bucket sort is a generalization of pigeonhole sort. Bucket sort can be
implemented with comparisons and therefore can also be considered a
comparison sort algorithm. The computational complexity estimates involve the
number of buckets.
Time Complexity of Solution:
Worst case scenario occurs when all the elements are placed in a single bucket.
The overall performance would then be dominated by the algorithm used to sort each
bucket. In this case, O(n log n), because of TimSort
Average Case O(n + (n^2)/k + k), where k is the number of buckets
If k = O(n), time complexity is O(n)
Source: https://en.wikipedia.org/wiki/Bucket_sort
"""
from __future__ import annotations
def bucket_sort(my_list: list) -> list:
"""
>>> data = [-1, 2, -5, 0]
>>> bucket_sort(data) == sorted(data)
True
>>> data = [9, 8, 7, 6, -12]
>>> bucket_sort(data) == sorted(data)
True
>>> data = [.4, 1.2, .1, .2, -.9]
>>> bucket_sort(data) == sorted(data)
True
>>> bucket_sort([]) == sorted([])
True
>>> import random
>>> collection = random.sample(range(-50, 50), 50)
>>> bucket_sort(collection) == sorted(collection)
True
"""
if len(my_list) == 0:
return []
min_value, max_value = min(my_list), max(my_list)
bucket_count = int(max_value - min_value) + 1
buckets: list[list] = [[] for _ in range(bucket_count)]
for i in my_list:
buckets[int(i - min_value)].append(i)
return [v for bucket in buckets for v in sorted(bucket)]
if __name__ == "__main__":
from doctest import testmod
testmod()
assert bucket_sort([4, 5, 3, 2, 1]) == [1, 2, 3, 4, 5]
assert bucket_sort([0, 1, -10, 15, 2, -2]) == [-10, -2, 0, 1, 2, 15]
| #!/usr/bin/env python3
"""
Illustrate how to implement the bucket sort algorithm.
Author: OMKAR PATHAK
This program illustrates how to implement the bucket sort algorithm.
Wikipedia says: Bucket sort, or bin sort, is a sorting algorithm that works
by distributing the elements of an array into a number of buckets.
Each bucket is then sorted individually, either using a different sorting
algorithm, or by recursively applying the bucket sorting algorithm. It is a
distribution sort, and is a cousin of radix sort in the most to least
significant digit flavour.
Bucket sort is a generalization of pigeonhole sort. Bucket sort can be
implemented with comparisons and therefore can also be considered a
comparison sort algorithm. The computational complexity estimates involve the
number of buckets.
Time Complexity of Solution:
Worst case scenario occurs when all the elements are placed in a single bucket.
The overall performance would then be dominated by the algorithm used to sort each
bucket. In this case, O(n log n), because of TimSort
Average Case O(n + (n^2)/k + k), where k is the number of buckets
If k = O(n), time complexity is O(n)
Source: https://en.wikipedia.org/wiki/Bucket_sort
"""
from __future__ import annotations
def bucket_sort(my_list: list) -> list:
"""
>>> data = [-1, 2, -5, 0]
>>> bucket_sort(data) == sorted(data)
True
>>> data = [9, 8, 7, 6, -12]
>>> bucket_sort(data) == sorted(data)
True
>>> data = [.4, 1.2, .1, .2, -.9]
>>> bucket_sort(data) == sorted(data)
True
>>> bucket_sort([]) == sorted([])
True
>>> import random
>>> collection = random.sample(range(-50, 50), 50)
>>> bucket_sort(collection) == sorted(collection)
True
"""
if len(my_list) == 0:
return []
min_value, max_value = min(my_list), max(my_list)
bucket_count = int(max_value - min_value) + 1
buckets: list[list] = [[] for _ in range(bucket_count)]
for i in my_list:
buckets[int(i - min_value)].append(i)
return [v for bucket in buckets for v in sorted(bucket)]
if __name__ == "__main__":
from doctest import testmod
testmod()
assert bucket_sort([4, 5, 3, 2, 1]) == [1, 2, 3, 4, 5]
assert bucket_sort([0, 1, -10, 15, 2, -2]) == [-10, -2, 0, 1, 2, 15]
| -1 |
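To make the bucket-distribution step described in the docstring above concrete, here is a minimal sketch (not part of the file above) that traces how each value lands in a bucket via the same `int(value - min_value)` index before the per-bucket sort and concatenation:

```python
# Illustrative trace of the bucket assignment used by bucket_sort above.
data = [0.4, 1.2, 0.1, 0.2, -0.9]
min_value, max_value = min(data), max(data)
bucket_count = int(max_value - min_value) + 1            # int(2.1) + 1 == 3 buckets
buckets = [[] for _ in range(bucket_count)]
for value in data:
    buckets[int(value - min_value)].append(value)        # same index formula as bucket_sort
print(buckets)                                            # [[-0.9], [0.4, 0.1, 0.2], [1.2]]
print([v for bucket in buckets for v in sorted(bucket)])  # [-0.9, 0.1, 0.2, 0.4, 1.2]
```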
TheAlgorithms/Python | 6,258 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler` | ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | ngiachou | "2022-07-20T00:10:06Z" | "2022-09-14T08:40:04Z" | 81e30fd33c91bc37bc3baf54c42d1b192ecf41a6 | 2104fa7aebe8d76b2b2b2c47fe7e2ee615a05df6 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler`. ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | """
Problem 16: https://projecteuler.net/problem=16
2^15 = 32768 and the sum of its digits is 3 + 2 + 7 + 6 + 8 = 26.
What is the sum of the digits of the number 2^1000?
"""
def solution(power: int = 1000) -> int:
"""Returns the sum of the digits of the number 2^power.
>>> solution(1000)
1366
>>> solution(50)
76
>>> solution(20)
31
>>> solution(15)
26
"""
n = 2**power
r = 0
while n:
r, n = r + n % 10, n // 10
return r
if __name__ == "__main__":
print(solution(int(str(input()).strip())))
| """
Problem 16: https://projecteuler.net/problem=16
2^15 = 32768 and the sum of its digits is 3 + 2 + 7 + 6 + 8 = 26.
What is the sum of the digits of the number 2^1000?
"""
def solution(power: int = 1000) -> int:
"""Returns the sum of the digits of the number 2^power.
>>> solution(1000)
1366
>>> solution(50)
76
>>> solution(20)
31
>>> solution(15)
26
"""
n = 2**power
r = 0
while n:
r, n = r + n % 10, n // 10
return r
if __name__ == "__main__":
print(solution(int(str(input()).strip())))
| -1 |
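As a quick cross-check of the divmod-style digit-sum loop above, a string-based version gives the same results for the values quoted in the docstring (26 for 2^15, 1366 for 2^1000). This is an illustrative sketch, not part of the solution file:

```python
# Same digit sum computed by converting the number to a string.
def digit_sum(n: int) -> int:
    return sum(int(digit) for digit in str(n))

assert digit_sum(2**15) == 26       # matches solution(15)
assert digit_sum(2**1000) == 1366   # matches solution(1000)
```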
TheAlgorithms/Python | 6,258 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler` | ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | ngiachou | "2022-07-20T00:10:06Z" | "2022-09-14T08:40:04Z" | 81e30fd33c91bc37bc3baf54c42d1b192ecf41a6 | 2104fa7aebe8d76b2b2b2c47fe7e2ee615a05df6 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler`. ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | """
This is a pure Python implementation of the linear search algorithm
For doctests run the following command:
python3 -m doctest -v linear_search.py
For manual testing run:
python3 linear_search.py
"""
def linear_search(sequence: list, target: int) -> int:
"""A pure Python implementation of a linear search algorithm
:param sequence: a collection with comparable items (as sorted items not required
in Linear Search)
:param target: item value to search
    :return: index of the found item or -1 if the item is not found
Examples:
>>> linear_search([0, 5, 7, 10, 15], 0)
0
>>> linear_search([0, 5, 7, 10, 15], 15)
4
>>> linear_search([0, 5, 7, 10, 15], 5)
1
>>> linear_search([0, 5, 7, 10, 15], 6)
-1
"""
for index, item in enumerate(sequence):
if item == target:
return index
return -1
def rec_linear_search(sequence: list, low: int, high: int, target: int) -> int:
"""
A pure Python implementation of a recursive linear search algorithm
:param sequence: a collection with comparable items (as sorted items not required
in Linear Search)
:param low: Lower bound of the array
:param high: Higher bound of the array
:param target: The element to be found
:return: Index of the key or -1 if key not found
Examples:
>>> rec_linear_search([0, 30, 500, 100, 700], 0, 4, 0)
0
>>> rec_linear_search([0, 30, 500, 100, 700], 0, 4, 700)
4
>>> rec_linear_search([0, 30, 500, 100, 700], 0, 4, 30)
1
>>> rec_linear_search([0, 30, 500, 100, 700], 0, 4, -6)
-1
"""
if not (0 <= high < len(sequence) and 0 <= low < len(sequence)):
raise Exception("Invalid upper or lower bound!")
if high < low:
return -1
if sequence[low] == target:
return low
if sequence[high] == target:
return high
return rec_linear_search(sequence, low + 1, high - 1, target)
if __name__ == "__main__":
user_input = input("Enter numbers separated by comma:\n").strip()
sequence = [int(item.strip()) for item in user_input.split(",")]
target = int(input("Enter a single number to be found in the list:\n").strip())
result = linear_search(sequence, target)
if result != -1:
print(f"linear_search({sequence}, {target}) = {result}")
else:
print(f"{target} was not found in {sequence}")
| """
This is a pure Python implementation of the linear search algorithm
For doctests run the following command:
python3 -m doctest -v linear_search.py
For manual testing run:
python3 linear_search.py
"""
def linear_search(sequence: list, target: int) -> int:
"""A pure Python implementation of a linear search algorithm
:param sequence: a collection with comparable items (as sorted items not required
in Linear Search)
:param target: item value to search
    :return: index of the found item or -1 if the item is not found
Examples:
>>> linear_search([0, 5, 7, 10, 15], 0)
0
>>> linear_search([0, 5, 7, 10, 15], 15)
4
>>> linear_search([0, 5, 7, 10, 15], 5)
1
>>> linear_search([0, 5, 7, 10, 15], 6)
-1
"""
for index, item in enumerate(sequence):
if item == target:
return index
return -1
def rec_linear_search(sequence: list, low: int, high: int, target: int) -> int:
"""
A pure Python implementation of a recursive linear search algorithm
:param sequence: a collection with comparable items (as sorted items not required
in Linear Search)
:param low: Lower bound of the array
:param high: Higher bound of the array
:param target: The element to be found
:return: Index of the key or -1 if key not found
Examples:
>>> rec_linear_search([0, 30, 500, 100, 700], 0, 4, 0)
0
>>> rec_linear_search([0, 30, 500, 100, 700], 0, 4, 700)
4
>>> rec_linear_search([0, 30, 500, 100, 700], 0, 4, 30)
1
>>> rec_linear_search([0, 30, 500, 100, 700], 0, 4, -6)
-1
"""
if not (0 <= high < len(sequence) and 0 <= low < len(sequence)):
raise Exception("Invalid upper or lower bound!")
if high < low:
return -1
if sequence[low] == target:
return low
if sequence[high] == target:
return high
return rec_linear_search(sequence, low + 1, high - 1, target)
if __name__ == "__main__":
user_input = input("Enter numbers separated by comma:\n").strip()
sequence = [int(item.strip()) for item in user_input.split(",")]
target = int(input("Enter a single number to be found in the list:\n").strip())
result = linear_search(sequence, target)
if result != -1:
print(f"linear_search({sequence}, {target}) = {result}")
else:
print(f"{target} was not found in {sequence}")
| -1 |
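The recursive variant above inspects both ends of the remaining window on every call, so the window shrinks by two per step. A small iterative sketch (illustration only, not from the file above) that mirrors this behaviour and counts the steps:

```python
# Counts how many window-shrinking steps the two-ended search needs.
def two_ended_search_steps(sequence: list, target: int) -> int:
    steps, low, high = 0, 0, len(sequence) - 1
    while low <= high:
        steps += 1
        if sequence[low] == target or sequence[high] == target:
            break
        low, high = low + 1, high - 1
    return steps

print(two_ended_search_steps([0, 30, 500, 100, 700], 100))  # 2 steps instead of 4 single-ended comparisons
```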
TheAlgorithms/Python | 6,258 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler` | ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | ngiachou | "2022-07-20T00:10:06Z" | "2022-09-14T08:40:04Z" | 81e30fd33c91bc37bc3baf54c42d1b192ecf41a6 | 2104fa7aebe8d76b2b2b2c47fe7e2ee615a05df6 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler`. ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | """
This is an implementation of odd-even transposition sort.
It works by performing a series of parallel swaps between odd and even pairs of
variables in the list.
This implementation represents each variable in the list with a process and
each process communicates with its neighboring processes in the list to perform
comparisons.
They are synchronized with locks and message passing but other forms of
synchronization could be used.
"""
from multiprocessing import Lock, Pipe, Process
# lock used to ensure that two processes do not access a pipe at the same time
processLock = Lock()
"""
The function run by the processes that sorts the list
position = the position in the list the process represents, used to know which
neighbor we pass our value to
value = the initial value at list[position]
LSend, RSend = the pipes we use to send to our left and right neighbors
LRcv, RRcv = the pipes we use to receive from our left and right neighbors
resultPipe = the pipe used to send results back to main
"""
def oeProcess(position, value, LSend, RSend, LRcv, RRcv, resultPipe):
global processLock
    # we perform n swap phases (n = length of the list); after n phases the
    # list is guaranteed to be sorted. Here n is hard-coded to 10 to match the
    # 10-element list built in main(). We *could* stop early if we are already
    # sorted, but detecting that takes as long as finishing the phases.
    for i in range(0, 10):
if (i + position) % 2 == 0 and RSend is not None:
# send your value to your right neighbor
processLock.acquire()
RSend[1].send(value)
processLock.release()
# receive your right neighbor's value
processLock.acquire()
temp = RRcv[0].recv()
processLock.release()
# take the lower value since you are on the left
value = min(value, temp)
elif (i + position) % 2 != 0 and LSend is not None:
# send your value to your left neighbor
processLock.acquire()
LSend[1].send(value)
processLock.release()
# receive your left neighbor's value
processLock.acquire()
temp = LRcv[0].recv()
processLock.release()
# take the higher value since you are on the right
value = max(value, temp)
# after all swaps are performed, send the values back to main
resultPipe[1].send(value)
"""
the function which creates the processes that perform the parallel swaps
arr = the list to be sorted
"""
def OddEvenTransposition(arr):
processArray = []
resultPipe = []
# initialize the list of pipes where the values will be retrieved
for _ in arr:
resultPipe.append(Pipe())
# creates the processes
# the first and last process only have one neighbor so they are made outside
# of the loop
tempRs = Pipe()
tempRr = Pipe()
processArray.append(
Process(
target=oeProcess,
args=(0, arr[0], None, tempRs, None, tempRr, resultPipe[0]),
)
)
tempLr = tempRs
tempLs = tempRr
for i in range(1, len(arr) - 1):
tempRs = Pipe()
tempRr = Pipe()
processArray.append(
Process(
target=oeProcess,
args=(i, arr[i], tempLs, tempRs, tempLr, tempRr, resultPipe[i]),
)
)
tempLr = tempRs
tempLs = tempRr
processArray.append(
Process(
target=oeProcess,
args=(
len(arr) - 1,
arr[len(arr) - 1],
tempLs,
None,
tempLr,
None,
resultPipe[len(arr) - 1],
),
)
)
# start the processes
for p in processArray:
p.start()
# wait for the processes to end and write their values to the list
for p in range(0, len(resultPipe)):
arr[p] = resultPipe[p][0].recv()
processArray[p].join()
return arr
# creates a reverse sorted list and sorts it
def main():
arr = list(range(10, 0, -1))
print("Initial List")
print(*arr)
arr = OddEvenTransposition(arr)
print("Sorted List\n")
print(*arr)
if __name__ == "__main__":
main()
| """
This is an implementation of odd-even transposition sort.
It works by performing a series of parallel swaps between odd and even pairs of
variables in the list.
This implementation represents each variable in the list with a process and
each process communicates with its neighboring processes in the list to perform
comparisons.
They are synchronized with locks and message passing but other forms of
synchronization could be used.
"""
from multiprocessing import Lock, Pipe, Process
# lock used to ensure that two processes do not access a pipe at the same time
processLock = Lock()
"""
The function run by the processes that sorts the list
position = the position in the list the process represents, used to know which
neighbor we pass our value to
value = the initial value at list[position]
LSend, RSend = the pipes we use to send to our left and right neighbors
LRcv, RRcv = the pipes we use to receive from our left and right neighbors
resultPipe = the pipe used to send results back to main
"""
def oeProcess(position, value, LSend, RSend, LRcv, RRcv, resultPipe):
global processLock
    # we perform n swap phases (n = length of the list); after n phases the
    # list is guaranteed to be sorted. Here n is hard-coded to 10 to match the
    # 10-element list built in main(). We *could* stop early if we are already
    # sorted, but detecting that takes as long as finishing the phases.
    for i in range(0, 10):
if (i + position) % 2 == 0 and RSend is not None:
# send your value to your right neighbor
processLock.acquire()
RSend[1].send(value)
processLock.release()
# receive your right neighbor's value
processLock.acquire()
temp = RRcv[0].recv()
processLock.release()
# take the lower value since you are on the left
value = min(value, temp)
elif (i + position) % 2 != 0 and LSend is not None:
# send your value to your left neighbor
processLock.acquire()
LSend[1].send(value)
processLock.release()
# receive your left neighbor's value
processLock.acquire()
temp = LRcv[0].recv()
processLock.release()
# take the higher value since you are on the right
value = max(value, temp)
# after all swaps are performed, send the values back to main
resultPipe[1].send(value)
"""
the function which creates the processes that perform the parallel swaps
arr = the list to be sorted
"""
def OddEvenTransposition(arr):
processArray = []
resultPipe = []
# initialize the list of pipes where the values will be retrieved
for _ in arr:
resultPipe.append(Pipe())
# creates the processes
# the first and last process only have one neighbor so they are made outside
# of the loop
tempRs = Pipe()
tempRr = Pipe()
processArray.append(
Process(
target=oeProcess,
args=(0, arr[0], None, tempRs, None, tempRr, resultPipe[0]),
)
)
tempLr = tempRs
tempLs = tempRr
for i in range(1, len(arr) - 1):
tempRs = Pipe()
tempRr = Pipe()
processArray.append(
Process(
target=oeProcess,
args=(i, arr[i], tempLs, tempRs, tempLr, tempRr, resultPipe[i]),
)
)
tempLr = tempRs
tempLs = tempRr
processArray.append(
Process(
target=oeProcess,
args=(
len(arr) - 1,
arr[len(arr) - 1],
tempLs,
None,
tempLr,
None,
resultPipe[len(arr) - 1],
),
)
)
# start the processes
for p in processArray:
p.start()
# wait for the processes to end and write their values to the list
for p in range(0, len(resultPipe)):
arr[p] = resultPipe[p][0].recv()
processArray[p].join()
return arr
# creates a reverse sorted list and sorts it
def main():
arr = list(range(10, 0, -1))
print("Initial List")
print(*arr)
arr = OddEvenTransposition(arr)
print("Sorted List\n")
print(*arr)
if __name__ == "__main__":
main()
| -1 |
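The pipe wiring above can obscure the underlying comparison pattern. A sequential sketch of the same odd-even phases (no processes or pipes, illustration only) shows why a list of length n is sorted after n phases:

```python
# Sequential odd-even transposition sort: n phases of pairwise compare-and-swap.
def odd_even_transposition_sequential(values: list) -> list:
    arr = list(values)
    n = len(arr)
    for phase in range(n):
        start = 0 if phase % 2 == 0 else 1   # even phase: pairs (0,1),(2,3),...; odd phase: (1,2),(3,4),...
        for i in range(start, n - 1, 2):
            if arr[i] > arr[i + 1]:
                arr[i], arr[i + 1] = arr[i + 1], arr[i]
    return arr

print(odd_even_transposition_sequential(list(range(10, 0, -1))))  # [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
```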
TheAlgorithms/Python | 6,258 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler` | ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | ngiachou | "2022-07-20T00:10:06Z" | "2022-09-14T08:40:04Z" | 81e30fd33c91bc37bc3baf54c42d1b192ecf41a6 | 2104fa7aebe8d76b2b2b2c47fe7e2ee615a05df6 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler`. ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | """
The algorithm finds the distance between the closest pair of points
among the given n points.
Approach used -> Divide and conquer
The points are sorted based on X-coords and
then based on Y-coords separately.
By applying the divide and conquer approach,
the minimum distance is obtained recursively.
>> The closest points can lie on different sides of the partition.
This case is handled by forming a strip of points
whose X-coord distance from the mid-point's X-coord is less
than closest_pair_dis. Points sorted based on Y-coords
are used in this step to reduce sorting time.
The closest pair distance is found in the strip of points. (closest_in_strip)
min(closest_pair_dis, closest_in_strip) would be the final answer.
Time complexity: O(n * log n)
"""
def euclidean_distance_sqr(point1, point2):
"""
>>> euclidean_distance_sqr([1,2],[2,4])
5
"""
return (point1[0] - point2[0]) ** 2 + (point1[1] - point2[1]) ** 2
def column_based_sort(array, column=0):
"""
>>> column_based_sort([(5, 1), (4, 2), (3, 0)], 1)
[(3, 0), (5, 1), (4, 2)]
"""
return sorted(array, key=lambda x: x[column])
def dis_between_closest_pair(points, points_counts, min_dis=float("inf")):
"""
brute force approach to find distance between closest pair points
Parameters :
points, points_count, min_dis (list(tuple(int, int)), int, int)
Returns :
min_dis (float): distance between closest pair of points
>>> dis_between_closest_pair([[1,2],[2,4],[5,7],[8,9],[11,0]],5)
5
"""
for i in range(points_counts - 1):
for j in range(i + 1, points_counts):
current_dis = euclidean_distance_sqr(points[i], points[j])
if current_dis < min_dis:
min_dis = current_dis
return min_dis
def dis_between_closest_in_strip(points, points_counts, min_dis=float("inf")):
"""
closest pair of points in strip
Parameters :
points, points_count, min_dis (list(tuple(int, int)), int, int)
Returns :
min_dis (float): distance btw closest pair of points in the strip (< min_dis)
>>> dis_between_closest_in_strip([[1,2],[2,4],[5,7],[8,9],[11,0]],5)
85
"""
for i in range(min(6, points_counts - 1), points_counts):
for j in range(max(0, i - 6), i):
current_dis = euclidean_distance_sqr(points[i], points[j])
if current_dis < min_dis:
min_dis = current_dis
return min_dis
def closest_pair_of_points_sqr(points_sorted_on_x, points_sorted_on_y, points_counts):
"""divide and conquer approach
Parameters :
points, points_count (list(tuple(int, int)), int)
Returns :
(float): distance btw closest pair of points
>>> closest_pair_of_points_sqr([(1, 2), (3, 4)], [(5, 6), (7, 8)], 2)
8
"""
# base case
if points_counts <= 3:
return dis_between_closest_pair(points_sorted_on_x, points_counts)
# recursion
mid = points_counts // 2
closest_in_left = closest_pair_of_points_sqr(
points_sorted_on_x, points_sorted_on_y[:mid], mid
)
closest_in_right = closest_pair_of_points_sqr(
points_sorted_on_y, points_sorted_on_y[mid:], points_counts - mid
)
closest_pair_dis = min(closest_in_left, closest_in_right)
"""
cross_strip contains the points, whose Xcoords are at a
distance(< closest_pair_dis) from mid's Xcoord
"""
cross_strip = []
for point in points_sorted_on_x:
if abs(point[0] - points_sorted_on_x[mid][0]) < closest_pair_dis:
cross_strip.append(point)
closest_in_strip = dis_between_closest_in_strip(
cross_strip, len(cross_strip), closest_pair_dis
)
return min(closest_pair_dis, closest_in_strip)
def closest_pair_of_points(points, points_counts):
"""
>>> closest_pair_of_points([(2, 3), (12, 30)], len([(2, 3), (12, 30)]))
28.792360097775937
"""
points_sorted_on_x = column_based_sort(points, column=0)
points_sorted_on_y = column_based_sort(points, column=1)
return (
closest_pair_of_points_sqr(
points_sorted_on_x, points_sorted_on_y, points_counts
)
) ** 0.5
if __name__ == "__main__":
points = [(2, 3), (12, 30), (40, 50), (5, 1), (12, 10), (3, 4)]
print("Distance:", closest_pair_of_points(points, len(points)))
| """
The algorithm finds the distance between the closest pair of points
among the given n points.
Approach used -> Divide and conquer
The points are sorted based on X-coords and
then based on Y-coords separately.
By applying the divide and conquer approach,
the minimum distance is obtained recursively.
>> The closest points can lie on different sides of the partition.
This case is handled by forming a strip of points
whose X-coord distance from the mid-point's X-coord is less
than closest_pair_dis. Points sorted based on Y-coords
are used in this step to reduce sorting time.
The closest pair distance is found in the strip of points. (closest_in_strip)
min(closest_pair_dis, closest_in_strip) would be the final answer.
Time complexity: O(n * log n)
"""
def euclidean_distance_sqr(point1, point2):
"""
>>> euclidean_distance_sqr([1,2],[2,4])
5
"""
return (point1[0] - point2[0]) ** 2 + (point1[1] - point2[1]) ** 2
def column_based_sort(array, column=0):
"""
>>> column_based_sort([(5, 1), (4, 2), (3, 0)], 1)
[(3, 0), (5, 1), (4, 2)]
"""
return sorted(array, key=lambda x: x[column])
def dis_between_closest_pair(points, points_counts, min_dis=float("inf")):
"""
brute force approach to find distance between closest pair points
Parameters :
points, points_count, min_dis (list(tuple(int, int)), int, int)
Returns :
min_dis (float): distance between closest pair of points
>>> dis_between_closest_pair([[1,2],[2,4],[5,7],[8,9],[11,0]],5)
5
"""
for i in range(points_counts - 1):
for j in range(i + 1, points_counts):
current_dis = euclidean_distance_sqr(points[i], points[j])
if current_dis < min_dis:
min_dis = current_dis
return min_dis
def dis_between_closest_in_strip(points, points_counts, min_dis=float("inf")):
"""
closest pair of points in strip
Parameters :
points, points_count, min_dis (list(tuple(int, int)), int, int)
Returns :
min_dis (float): distance btw closest pair of points in the strip (< min_dis)
>>> dis_between_closest_in_strip([[1,2],[2,4],[5,7],[8,9],[11,0]],5)
85
"""
for i in range(min(6, points_counts - 1), points_counts):
for j in range(max(0, i - 6), i):
current_dis = euclidean_distance_sqr(points[i], points[j])
if current_dis < min_dis:
min_dis = current_dis
return min_dis
def closest_pair_of_points_sqr(points_sorted_on_x, points_sorted_on_y, points_counts):
"""divide and conquer approach
Parameters :
points, points_count (list(tuple(int, int)), int)
Returns :
(float): distance btw closest pair of points
>>> closest_pair_of_points_sqr([(1, 2), (3, 4)], [(5, 6), (7, 8)], 2)
8
"""
# base case
if points_counts <= 3:
return dis_between_closest_pair(points_sorted_on_x, points_counts)
# recursion
mid = points_counts // 2
closest_in_left = closest_pair_of_points_sqr(
points_sorted_on_x, points_sorted_on_y[:mid], mid
)
closest_in_right = closest_pair_of_points_sqr(
points_sorted_on_y, points_sorted_on_y[mid:], points_counts - mid
)
closest_pair_dis = min(closest_in_left, closest_in_right)
"""
cross_strip contains the points, whose Xcoords are at a
distance(< closest_pair_dis) from mid's Xcoord
"""
cross_strip = []
for point in points_sorted_on_x:
if abs(point[0] - points_sorted_on_x[mid][0]) < closest_pair_dis:
cross_strip.append(point)
closest_in_strip = dis_between_closest_in_strip(
cross_strip, len(cross_strip), closest_pair_dis
)
return min(closest_pair_dis, closest_in_strip)
def closest_pair_of_points(points, points_counts):
"""
>>> closest_pair_of_points([(2, 3), (12, 30)], len([(2, 3), (12, 30)]))
28.792360097775937
"""
points_sorted_on_x = column_based_sort(points, column=0)
points_sorted_on_y = column_based_sort(points, column=1)
return (
closest_pair_of_points_sqr(
points_sorted_on_x, points_sorted_on_y, points_counts
)
) ** 0.5
if __name__ == "__main__":
points = [(2, 3), (12, 30), (40, 50), (5, 1), (12, 10), (3, 4)]
print("Distance:", closest_pair_of_points(points, len(points)))
| -1 |
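A quadratic brute-force scan is a handy reference for the divide-and-conquer result above on small inputs; this sketch (illustration only, not part of the file) reproduces the distance from the file's doctest:

```python
# O(n^2) reference implementation: exhaustive pairwise Euclidean distances.
def closest_pair_brute_force(points: list) -> float:
    best = float("inf")
    for i in range(len(points) - 1):
        for j in range(i + 1, len(points)):
            dx = points[i][0] - points[j][0]
            dy = points[i][1] - points[j][1]
            best = min(best, (dx * dx + dy * dy) ** 0.5)
    return best

print(closest_pair_brute_force([(2, 3), (12, 30)]))  # 28.792360097775937, as in the doctest above
```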
TheAlgorithms/Python | 6,258 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler` | ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | ngiachou | "2022-07-20T00:10:06Z" | "2022-09-14T08:40:04Z" | 81e30fd33c91bc37bc3baf54c42d1b192ecf41a6 | 2104fa7aebe8d76b2b2b2c47fe7e2ee615a05df6 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler`. ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | import math
def perfect_square(num: int) -> bool:
"""
    Check if a number is a perfect square or not
    :param num: the number to be checked
    :return: True if the number is a perfect square, otherwise False
>>> perfect_square(9)
True
>>> perfect_square(16)
True
>>> perfect_square(1)
True
>>> perfect_square(0)
True
>>> perfect_square(10)
False
"""
return math.sqrt(num) * math.sqrt(num) == num
def perfect_square_binary_search(n: int) -> bool:
"""
Check if a number is perfect square using binary search.
Time complexity : O(Log(n))
Space complexity: O(1)
>>> perfect_square_binary_search(9)
True
>>> perfect_square_binary_search(16)
True
>>> perfect_square_binary_search(1)
True
>>> perfect_square_binary_search(0)
True
>>> perfect_square_binary_search(10)
False
>>> perfect_square_binary_search(-1)
False
>>> perfect_square_binary_search(1.1)
False
>>> perfect_square_binary_search("a")
Traceback (most recent call last):
...
TypeError: '<=' not supported between instances of 'int' and 'str'
>>> perfect_square_binary_search(None)
Traceback (most recent call last):
...
TypeError: '<=' not supported between instances of 'int' and 'NoneType'
>>> perfect_square_binary_search([])
Traceback (most recent call last):
...
TypeError: '<=' not supported between instances of 'int' and 'list'
"""
left = 0
right = n
while left <= right:
mid = (left + right) // 2
if mid**2 == n:
return True
elif mid**2 > n:
right = mid - 1
else:
left = mid + 1
return False
if __name__ == "__main__":
import doctest
doctest.testmod()
| import math
def perfect_square(num: int) -> bool:
"""
    Check if a number is a perfect square or not
    :param num: the number to be checked
    :return: True if the number is a perfect square, otherwise False
>>> perfect_square(9)
True
>>> perfect_square(16)
True
>>> perfect_square(1)
True
>>> perfect_square(0)
True
>>> perfect_square(10)
False
"""
return math.sqrt(num) * math.sqrt(num) == num
def perfect_square_binary_search(n: int) -> bool:
"""
Check if a number is perfect square using binary search.
Time complexity : O(Log(n))
Space complexity: O(1)
>>> perfect_square_binary_search(9)
True
>>> perfect_square_binary_search(16)
True
>>> perfect_square_binary_search(1)
True
>>> perfect_square_binary_search(0)
True
>>> perfect_square_binary_search(10)
False
>>> perfect_square_binary_search(-1)
False
>>> perfect_square_binary_search(1.1)
False
>>> perfect_square_binary_search("a")
Traceback (most recent call last):
...
TypeError: '<=' not supported between instances of 'int' and 'str'
>>> perfect_square_binary_search(None)
Traceback (most recent call last):
...
TypeError: '<=' not supported between instances of 'int' and 'NoneType'
>>> perfect_square_binary_search([])
Traceback (most recent call last):
...
TypeError: '<=' not supported between instances of 'int' and 'list'
"""
left = 0
right = n
while left <= right:
mid = (left + right) // 2
if mid**2 == n:
return True
elif mid**2 > n:
right = mid - 1
else:
left = mid + 1
return False
if __name__ == "__main__":
import doctest
doctest.testmod()
| -1 |
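The float comparison in `perfect_square` above can misfire for very large integers because `math.sqrt` rounds. Here is a sketch of an integer-only alternative using `math.isqrt` (available since Python 3.8); it is an extra illustration, not part of the file:

```python
import math

# Integer square-root check: no floating point involved.
def perfect_square_isqrt(num: int) -> bool:
    if num < 0:
        return False
    root = math.isqrt(num)
    return root * root == num

print([perfect_square_isqrt(n) for n in (0, 1, 9, 10, 16, 10**30)])
# [True, True, True, False, True, True]
```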
TheAlgorithms/Python | 6,258 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler` | ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | ngiachou | "2022-07-20T00:10:06Z" | "2022-09-14T08:40:04Z" | 81e30fd33c91bc37bc3baf54c42d1b192ecf41a6 | 2104fa7aebe8d76b2b2b2c47fe7e2ee615a05df6 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler`. ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | alphabets = [chr(i) for i in range(32, 126)]
gear_one = list(range(len(alphabets)))
gear_two = list(range(len(alphabets)))
gear_three = list(range(len(alphabets)))
reflector = list(reversed(range(len(alphabets))))
code = []
gear_one_pos = gear_two_pos = gear_three_pos = 0
def rotator():
global gear_one_pos
global gear_two_pos
global gear_three_pos
i = gear_one[0]
gear_one.append(i)
del gear_one[0]
gear_one_pos += 1
if gear_one_pos % int(len(alphabets)) == 0:
i = gear_two[0]
gear_two.append(i)
del gear_two[0]
gear_two_pos += 1
if gear_two_pos % int(len(alphabets)) == 0:
i = gear_three[0]
gear_three.append(i)
del gear_three[0]
gear_three_pos += 1
def engine(input_character):
target = alphabets.index(input_character)
target = gear_one[target]
target = gear_two[target]
target = gear_three[target]
target = reflector[target]
target = gear_three.index(target)
target = gear_two.index(target)
target = gear_one.index(target)
code.append(alphabets[target])
rotator()
if __name__ == "__main__":
decode = list(input("Type your message:\n"))
while True:
try:
token = int(input("Please set token:(must be only digits)\n"))
break
except Exception as error:
print(error)
for i in range(token):
rotator()
for j in decode:
engine(j)
print("\n" + "".join(code))
print(
f"\nYour Token is {token} please write it down.\nIf you want to decode "
"this message again you should input same digits as token!"
)
| alphabets = [chr(i) for i in range(32, 126)]
gear_one = list(range(len(alphabets)))
gear_two = list(range(len(alphabets)))
gear_three = list(range(len(alphabets)))
reflector = list(reversed(range(len(alphabets))))
code = []
gear_one_pos = gear_two_pos = gear_three_pos = 0
def rotator():
global gear_one_pos
global gear_two_pos
global gear_three_pos
i = gear_one[0]
gear_one.append(i)
del gear_one[0]
gear_one_pos += 1
if gear_one_pos % int(len(alphabets)) == 0:
i = gear_two[0]
gear_two.append(i)
del gear_two[0]
gear_two_pos += 1
if gear_two_pos % int(len(alphabets)) == 0:
i = gear_three[0]
gear_three.append(i)
del gear_three[0]
gear_three_pos += 1
def engine(input_character):
target = alphabets.index(input_character)
target = gear_one[target]
target = gear_two[target]
target = gear_three[target]
target = reflector[target]
target = gear_three.index(target)
target = gear_two.index(target)
target = gear_one.index(target)
code.append(alphabets[target])
rotator()
if __name__ == "__main__":
decode = list(input("Type your message:\n"))
while True:
try:
token = int(input("Please set token:(must be only digits)\n"))
break
except Exception as error:
print(error)
for i in range(token):
rotator()
for j in decode:
engine(j)
print("\n" + "".join(code))
print(
f"\nYour Token is {token} please write it down.\nIf you want to decode "
"this message again you should input same digits as token!"
)
| -1 |
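One reason the closing message above says the same token decodes the text: the reflector pairs positions symmetrically and never maps a position to itself, so each per-character mapping is its own inverse (as in a classic Enigma reflector). A small sketch (illustration only) checking those two properties:

```python
# The reflector used above: position i maps to (n - 1 - i) over the 94 printable characters.
n = len([chr(i) for i in range(32, 126)])                    # 94
reflector = list(reversed(range(n)))
assert all(reflector[reflector[i]] == i for i in range(n))   # involution: applying it twice is the identity
assert all(reflector[i] != i for i in range(n))              # no fixed point: i == 93 - i has no integer solution
print("reflector is a fixed-point-free involution over", n, "characters")
```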
TheAlgorithms/Python | 6,258 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler` | ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | ngiachou | "2022-07-20T00:10:06Z" | "2022-09-14T08:40:04Z" | 81e30fd33c91bc37bc3baf54c42d1b192ecf41a6 | 2104fa7aebe8d76b2b2b2c47fe7e2ee615a05df6 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler`. ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | #!/bin/sh
#
# Copyright (c) 2006, 2008 Junio C Hamano
#
# The "pre-rebase" hook is run just before "git rebase" starts doing
# its job, and can prevent the command from running by exiting with
# non-zero status.
#
# The hook is called with the following parameters:
#
# $1 -- the upstream the series was forked from.
# $2 -- the branch being rebased (or empty when rebasing the current branch).
#
# This sample shows how to prevent topic branches that are already
# merged to 'next' branch from getting rebased, because allowing it
# would result in rebasing already published history.
publish=next
basebranch="$1"
if test "$#" = 2
then
topic="refs/heads/$2"
else
topic=`git symbolic-ref HEAD` ||
exit 0 ;# we do not interrupt rebasing detached HEAD
fi
case "$topic" in
refs/heads/??/*)
;;
*)
exit 0 ;# we do not interrupt others.
;;
esac
# Now we are dealing with a topic branch being rebased
# on top of master. Is it OK to rebase it?
# Does the topic really exist?
git show-ref -q "$topic" || {
echo >&2 "No such branch $topic"
exit 1
}
# Is topic fully merged to master?
not_in_master=`git rev-list --pretty=oneline ^master "$topic"`
if test -z "$not_in_master"
then
echo >&2 "$topic is fully merged to master; better remove it."
exit 1 ;# we could allow it, but there is no point.
fi
# Is topic ever merged to next? If so you should not be rebasing it.
only_next_1=`git rev-list ^master "^$topic" ${publish} | sort`
only_next_2=`git rev-list ^master ${publish} | sort`
if test "$only_next_1" = "$only_next_2"
then
not_in_topic=`git rev-list "^$topic" master`
if test -z "$not_in_topic"
then
echo >&2 "$topic is already up to date with master"
exit 1 ;# we could allow it, but there is no point.
else
exit 0
fi
else
not_in_next=`git rev-list --pretty=oneline ^${publish} "$topic"`
/usr/bin/perl -e '
my $topic = $ARGV[0];
my $msg = "* $topic has commits already merged to public branch:\n";
my (%not_in_next) = map {
/^([0-9a-f]+) /;
($1 => 1);
} split(/\n/, $ARGV[1]);
for my $elem (map {
/^([0-9a-f]+) (.*)$/;
[$1 => $2];
} split(/\n/, $ARGV[2])) {
if (!exists $not_in_next{$elem->[0]}) {
if ($msg) {
print STDERR $msg;
undef $msg;
}
print STDERR " $elem->[1]\n";
}
}
' "$topic" "$not_in_next" "$not_in_master"
exit 1
fi
<<\DOC_END
This sample hook safeguards topic branches that have been
published from being rewound.
The workflow assumed here is:
* Once a topic branch forks from "master", "master" is never
merged into it again (either directly or indirectly).
* Once a topic branch is fully cooked and merged into "master",
it is deleted. If you need to build on top of it to correct
earlier mistakes, a new topic branch is created by forking at
the tip of the "master". This is not strictly necessary, but
it makes it easier to keep your history simple.
* Whenever you need to test or publish your changes to topic
branches, merge them into "next" branch.
The script, being an example, hardcodes the publish branch name
to be "next", but it is trivial to make it configurable via
$GIT_DIR/config mechanism.
With this workflow, you would want to know:
(1) ... if a topic branch has ever been merged to "next". Young
topic branches can have stupid mistakes you would rather
clean up before publishing, and things that have not been
merged into other branches can be easily rebased without
affecting other people. But once it is published, you would
not want to rewind it.
(2) ... if a topic branch has been fully merged to "master".
Then you can delete it. More importantly, you should not
build on top of it -- other people may already want to
change things related to the topic as patches against your
"master", so if you need further changes, it is better to
fork the topic (perhaps with the same name) afresh from the
tip of "master".
Let's look at this example:
           o---o---o---o---o---o---o---o---o---o "next"
          /       /           /           /
         /   a---a---b A     /           /
        /   /               /           /
       /   /   c---c---c---c B     /
      /   /   /             \     /
     /   /   /   b---b C     \   /
    /   /   /   /             \ /
---o---o---o---o---o---o---o---o---o---o---o "master"
A, B and C are topic branches.
* A has one fix since it was merged up to "next".
* B has finished. It has been fully merged up to "master" and "next",
and is ready to be deleted.
* C has not merged to "next" at all.
We would want to allow C to be rebased, refuse A, and encourage
B to be deleted.
To compute (1):
git rev-list ^master ^topic next
git rev-list ^master next
if these match, topic has not merged in next at all.
To compute (2):
git rev-list master..topic
if this is empty, it is fully merged to "master".
DOC_END
import math
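# Array-backed max segment tree: st[1] is the root and the children of st[i]
# live at st[2 * i] and st[2 * i + 1], so roughly 4 * N slots are enough.
# update(a, b, val) assigns val to every position in the 1-indexed, inclusive
# range [a, b]; query(a, b) returns the maximum over that range.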
class SegmentTree:
def __init__(self, A):
        self.A = A  # keep a reference so build() does not rely on a module-level A
        self.N = len(A)
self.st = [0] * (
4 * self.N
        )  # roughly 4 * N slots are enough for a segment tree over an array of length N
self.build(1, 0, self.N - 1)
def left(self, idx):
return idx * 2
def right(self, idx):
return idx * 2 + 1
def build(self, idx, l, r): # noqa: E741
if l == r: # noqa: E741
            self.st[idx] = self.A[l]
else:
mid = (l + r) // 2
self.build(self.left(idx), l, mid)
self.build(self.right(idx), mid + 1, r)
self.st[idx] = max(self.st[self.left(idx)], self.st[self.right(idx)])
def update(self, a, b, val):
return self.update_recursive(1, 0, self.N - 1, a - 1, b - 1, val)
def update_recursive(self, idx, l, r, a, b, val): # noqa: E741
"""
        update(1, 1, N, a, b, v): assign the value v to every position in [a, b]
"""
if r < a or l > b:
return True
if l == r: # noqa: E741
self.st[idx] = val
return True
mid = (l + r) // 2
self.update_recursive(self.left(idx), l, mid, a, b, val)
self.update_recursive(self.right(idx), mid + 1, r, a, b, val)
self.st[idx] = max(self.st[self.left(idx)], self.st[self.right(idx)])
return True
def query(self, a, b):
return self.query_recursive(1, 0, self.N - 1, a - 1, b - 1)
def query_recursive(self, idx, l, r, a, b): # noqa: E741
"""
        query(1, 1, N, a, b): return the maximum over [a, b]
"""
if r < a or l > b:
return -math.inf
if l >= a and r <= b: # noqa: E741
return self.st[idx]
mid = (l + r) // 2
q1 = self.query_recursive(self.left(idx), l, mid, a, b)
q2 = self.query_recursive(self.right(idx), mid + 1, r, a, b)
return max(q1, q2)
def showData(self):
showList = []
        for i in range(1, self.N + 1):
showList += [self.query(i, i)]
print(showList)
if __name__ == "__main__":
A = [1, 2, -4, 7, 3, -5, 6, 11, -20, 9, 14, 15, 5, 2, -8]
N = 15
segt = SegmentTree(A)
print(segt.query(4, 6))
print(segt.query(7, 11))
print(segt.query(7, 12))
segt.update(1, 3, 111)
print(segt.query(1, 15))
segt.update(7, 8, 235)
segt.showData()
import argparse
import datetime
def zeller(date_input: str) -> str:
"""
    Zeller's Congruence Algorithm
Find the day of the week for nearly any Gregorian or Julian calendar date
>>> zeller('01-31-2010')
'Your date 01-31-2010, is a Sunday!'
Validate out of range month
>>> zeller('13-31-2010')
Traceback (most recent call last):
...
ValueError: Month must be between 1 - 12
>>> zeller('.2-31-2010')
Traceback (most recent call last):
...
ValueError: invalid literal for int() with base 10: '.2'
Validate out of range date:
>>> zeller('01-33-2010')
Traceback (most recent call last):
...
ValueError: Date must be between 1 - 31
>>> zeller('01-.4-2010')
Traceback (most recent call last):
...
ValueError: invalid literal for int() with base 10: '.4'
Validate second separator:
>>> zeller('01-31*2010')
Traceback (most recent call last):
...
ValueError: Date separator must be '-' or '/'
Validate first separator:
>>> zeller('01^31-2010')
Traceback (most recent call last):
...
ValueError: Date separator must be '-' or '/'
Validate out of range year:
>>> zeller('01-31-8999')
Traceback (most recent call last):
...
ValueError: Year out of range. There has to be some sort of limit...right?
Test null input:
>>> zeller()
Traceback (most recent call last):
...
TypeError: zeller() missing 1 required positional argument: 'date_input'
Test length of date_input:
>>> zeller('')
Traceback (most recent call last):
...
ValueError: Must be 10 characters long
>>> zeller('01-31-19082939')
Traceback (most recent call last):
...
ValueError: Must be 10 characters long"""
# Days of the week for response
days = {
"0": "Sunday",
"1": "Monday",
"2": "Tuesday",
"3": "Wednesday",
"4": "Thursday",
"5": "Friday",
"6": "Saturday",
}
convert_datetime_days = {0: 1, 1: 2, 2: 3, 3: 4, 4: 5, 5: 6, 6: 0}
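    # datetime.weekday() numbers the days 0 = Monday .. 6 = Sunday, while this
    # script uses 0 = Sunday .. 6 = Saturday, so remap before cross-checking below.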
# Validate
if not 0 < len(date_input) < 11:
raise ValueError("Must be 10 characters long")
# Get month
m: int = int(date_input[0] + date_input[1])
# Validate
if not 0 < m < 13:
raise ValueError("Month must be between 1 - 12")
sep_1: str = date_input[2]
# Validate
if sep_1 not in ["-", "/"]:
raise ValueError("Date separator must be '-' or '/'")
# Get day
d: int = int(date_input[3] + date_input[4])
# Validate
if not 0 < d < 32:
raise ValueError("Date must be between 1 - 31")
# Get second separator
sep_2: str = date_input[5]
# Validate
if sep_2 not in ["-", "/"]:
raise ValueError("Date separator must be '-' or '/'")
# Get year
y: int = int(date_input[6] + date_input[7] + date_input[8] + date_input[9])
# Arbitrary year range
if not 45 < y < 8500:
raise ValueError(
"Year out of range. There has to be some sort of limit...right?"
)
# Get datetime obj for validation
dt_ck = datetime.date(int(y), int(m), int(d))
# Start math
if m <= 2:
y = y - 1
m = m + 12
# maths var
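    # In Zeller's-congruence terms (a rough gloss of the variables below): c is
    # the leading two digits of the year (the century), k is the remaining
    # digits (year within the century), t is the month-dependent term, and u
    # and v are the leap-year corrections floor(c / 4) and floor(k / 4).
    # The weekday index comes out as f = (d + k + t + u + v - 2 * c) mod 7,
    # with 0 = Sunday in the `days` table above.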
c: int = int(str(y)[:2])
k: int = int(str(y)[2:])
t: int = int(2.6 * m - 5.39)
u: int = int(c / 4)
v: int = int(k / 4)
x: int = int(d + k)
z: int = int(t + u + v + x)
w: int = int(z - (2 * c))
f: int = round(w % 7)
# End math
# Validate math
if f != convert_datetime_days[dt_ck.weekday()]:
raise AssertionError("The date was evaluated incorrectly. Contact developer.")
# Response
response: str = f"Your date {date_input}, is a {days[str(f)]}!"
return response
if __name__ == "__main__":
import doctest
doctest.testmod()
parser = argparse.ArgumentParser(
description=(
"Find out what day of the week nearly any date is or was. Enter "
"date as a string in the mm-dd-yyyy or mm/dd/yyyy format"
)
)
parser.add_argument(
"date_input", type=str, help="Date as a string (mm-dd-yyyy or mm/dd/yyyy)"
)
args = parser.parse_args()
zeller(args.date_input)
fixes #5434 | """
https://projecteuler.net/problem=234
For an integer n ≥ 4, we define the lower prime square root of n, denoted by
lps(n), as the largest prime ≤ √n and the upper prime square root of n, ups(n),
as the smallest prime ≥ √n.
So, for example, lps(4) = 2 = ups(4), lps(1000) = 31, ups(1000) = 37. Let us
call an integer n ≥ 4 semidivisible, if one of lps(n) and ups(n) divides n,
but not both.
The sum of the semidivisible numbers not exceeding 15 is 30, the numbers are 8,
10 and 12. 15 is not semidivisible because it is a multiple of both lps(15) = 3
and ups(15) = 5. As a further example, the sum of the 92 semidivisible numbers
up to 1000 is 34825.
What is the sum of all semidivisible numbers not exceeding 999966663333 ?
"""
import math
def prime_sieve(n: int) -> list:
"""
    Sieve of Eratosthenes
Function to return all the prime numbers up to a certain number
https://en.wikipedia.org/wiki/Sieve_of_Eratosthenes
>>> prime_sieve(3)
[2]
>>> prime_sieve(50)
[2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37, 41, 43, 47]
"""
is_prime = [True] * n
is_prime[0] = False
is_prime[1] = False
is_prime[2] = True
for i in range(3, int(n**0.5 + 1), 2):
index = i * 2
while index < n:
is_prime[index] = False
index = index + i
primes = [2]
for i in range(3, n, 2):
if is_prime[i]:
primes.append(i)
return primes
def solution(limit: int = 999_966_663_333) -> int:
"""
Computes the solution to the problem up to the specified limit
>>> solution(1000)
34825
>>> solution(10_000)
1134942
>>> solution(100_000)
36393008
"""
primes_upper_bound = math.floor(math.sqrt(limit)) + 100
primes = prime_sieve(primes_upper_bound)
matches_sum = 0
prime_index = 0
last_prime = primes[prime_index]
while (last_prime**2) <= limit:
next_prime = primes[prime_index + 1]
lower_bound = last_prime**2
upper_bound = next_prime**2
# Get numbers divisible by lps(current)
current = lower_bound + last_prime
while upper_bound > current <= limit:
matches_sum += current
current += last_prime
# Reset the upper_bound
while (upper_bound - next_prime) > limit:
upper_bound -= next_prime
# Add the numbers divisible by ups(current)
current = upper_bound - next_prime
while current > lower_bound:
matches_sum += current
current -= next_prime
# Remove the numbers divisible by both ups and lps
current = 0
while upper_bound > current <= limit:
if current <= lower_bound:
# Increment the current number
current += last_prime * next_prime
continue
if current > limit:
break
# Remove twice since it was added by both ups and lps
matches_sum -= current * 2
# Increment the current number
current += last_prime * next_prime
# Setup for next pair
last_prime = next_prime
prime_index += 1
return matches_sum
if __name__ == "__main__":
print(solution())
| """
https://projecteuler.net/problem=234
For an integer n ≥ 4, we define the lower prime square root of n, denoted by
lps(n), as the largest prime ≤ √n and the upper prime square root of n, ups(n),
as the smallest prime ≥ √n.
So, for example, lps(4) = 2 = ups(4), lps(1000) = 31, ups(1000) = 37. Let us
call an integer n ≥ 4 semidivisible, if one of lps(n) and ups(n) divides n,
but not both.
The sum of the semidivisible numbers not exceeding 15 is 30, the numbers are 8,
10 and 12. 15 is not semidivisible because it is a multiple of both lps(15) = 3
and ups(15) = 5. As a further example, the sum of the 92 semidivisible numbers
up to 1000 is 34825.
What is the sum of all semidivisible numbers not exceeding 999966663333 ?
"""
import math
def prime_sieve(n: int) -> list:
"""
Sieve of Erotosthenes
Function to return all the prime numbers up to a certain number
https://en.wikipedia.org/wiki/Sieve_of_Eratosthenes
>>> prime_sieve(3)
[2]
>>> prime_sieve(50)
[2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37, 41, 43, 47]
"""
is_prime = [True] * n
is_prime[0] = False
is_prime[1] = False
is_prime[2] = True
for i in range(3, int(n**0.5 + 1), 2):
index = i * 2
while index < n:
is_prime[index] = False
index = index + i
primes = [2]
for i in range(3, n, 2):
if is_prime[i]:
primes.append(i)
return primes
def solution(limit: int = 999_966_663_333) -> int:
"""
Computes the solution to the problem up to the specified limit
>>> solution(1000)
34825
>>> solution(10_000)
1134942
>>> solution(100_000)
36393008
"""
primes_upper_bound = math.floor(math.sqrt(limit)) + 100
primes = prime_sieve(primes_upper_bound)
matches_sum = 0
prime_index = 0
last_prime = primes[prime_index]
while (last_prime**2) <= limit:
next_prime = primes[prime_index + 1]
lower_bound = last_prime**2
upper_bound = next_prime**2
# Get numbers divisible by lps(current)
current = lower_bound + last_prime
while upper_bound > current <= limit:
matches_sum += current
current += last_prime
# Reset the upper_bound
while (upper_bound - next_prime) > limit:
upper_bound -= next_prime
# Add the numbers divisible by ups(current)
current = upper_bound - next_prime
while current > lower_bound:
matches_sum += current
current -= next_prime
# Remove the numbers divisible by both ups and lps
current = 0
while upper_bound > current <= limit:
if current <= lower_bound:
# Increment the current number
current += last_prime * next_prime
continue
if current > limit:
break
# Remove twice since it was added by both ups and lps
matches_sum -= current * 2
# Increment the current number
current += last_prime * next_prime
# Setup for next pair
last_prime = next_prime
prime_index += 1
return matches_sum
if __name__ == "__main__":
print(solution())
| -1 |
TheAlgorithms/Python | 6,258 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler` | ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | ngiachou | "2022-07-20T00:10:06Z" | "2022-09-14T08:40:04Z" | 81e30fd33c91bc37bc3baf54c42d1b192ecf41a6 | 2104fa7aebe8d76b2b2b2c47fe7e2ee615a05df6 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler`. ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | """
Totient maximum
Problem 69: https://projecteuler.net/problem=69
Euler's Totient function, φ(n) [sometimes called the phi function],
is used to determine the number of numbers less than n which are relatively prime to n.
For example, as 1, 2, 4, 5, 7, and 8,
are all less than nine and relatively prime to nine, φ(9)=6.
n Relatively Prime φ(n) n/φ(n)
2 1 1 2
3 1,2 2 1.5
4 1,3 2 2
5 1,2,3,4 4 1.25
6 1,5 2 3
7 1,2,3,4,5,6 6 1.1666...
8 1,3,5,7 4 2
9 1,2,4,5,7,8 6 1.5
10 1,3,7,9 4 2.5
It can be seen that n=6 produces a maximum n/φ(n) for n ≤ 10.
Find the value of n ≤ 1,000,000 for which n/φ(n) is a maximum.
"""
def solution(n: int = 10**6) -> int:
"""
Returns solution to problem.
Algorithm:
1. Precompute φ(k) for all natural k, k <= n using product formula (wikilink below)
https://en.wikipedia.org/wiki/Euler%27s_totient_function#Euler's_product_formula
2. Find k/φ(k) for all k ≤ n and return the k that attains maximum
>>> solution(10)
6
>>> solution(100)
30
>>> solution(9973)
2310
"""
if n <= 0:
raise ValueError("Please enter an integer greater than 0")
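    # Sieve-style totient computation: phi[k] starts out as k; when a number p
    # still satisfies phi[p] == p it must be prime, so phi[p] becomes p - 1 and
    # every multiple m of p is scaled by (1 - 1/p) with exact integer
    # arithmetic, phi[m] = phi[m] // p * (p - 1).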
phi = list(range(n + 1))
for number in range(2, n + 1):
if phi[number] == number:
phi[number] -= 1
for multiple in range(number * 2, n + 1, number):
phi[multiple] = (phi[multiple] // number) * (number - 1)
answer = 1
for number in range(1, n + 1):
if (answer / phi[answer]) < (number / phi[number]):
answer = number
return answer
if __name__ == "__main__":
print(solution())
| """
Totient maximum
Problem 69: https://projecteuler.net/problem=69
Euler's Totient function, φ(n) [sometimes called the phi function],
is used to determine the number of numbers less than n which are relatively prime to n.
For example, as 1, 2, 4, 5, 7, and 8,
are all less than nine and relatively prime to nine, φ(9)=6.
n Relatively Prime φ(n) n/φ(n)
2 1 1 2
3 1,2 2 1.5
4 1,3 2 2
5 1,2,3,4 4 1.25
6 1,5 2 3
7 1,2,3,4,5,6 6 1.1666...
8 1,3,5,7 4 2
9 1,2,4,5,7,8 6 1.5
10 1,3,7,9 4 2.5
It can be seen that n=6 produces a maximum n/φ(n) for n ≤ 10.
Find the value of n ≤ 1,000,000 for which n/φ(n) is a maximum.
"""
def solution(n: int = 10**6) -> int:
"""
Returns solution to problem.
Algorithm:
1. Precompute φ(k) for all natural k, k <= n using product formula (wikilink below)
https://en.wikipedia.org/wiki/Euler%27s_totient_function#Euler's_product_formula
2. Find k/φ(k) for all k ≤ n and return the k that attains maximum
>>> solution(10)
6
>>> solution(100)
30
>>> solution(9973)
2310
"""
if n <= 0:
raise ValueError("Please enter an integer greater than 0")
phi = list(range(n + 1))
for number in range(2, n + 1):
if phi[number] == number:
phi[number] -= 1
for multiple in range(number * 2, n + 1, number):
phi[multiple] = (phi[multiple] // number) * (number - 1)
answer = 1
for number in range(1, n + 1):
if (answer / phi[answer]) < (number / phi[number]):
answer = number
return answer
if __name__ == "__main__":
print(solution())
| -1 |
TheAlgorithms/Python | 6,258 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler` | ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | ngiachou | "2022-07-20T00:10:06Z" | "2022-09-14T08:40:04Z" | 81e30fd33c91bc37bc3baf54c42d1b192ecf41a6 | 2104fa7aebe8d76b2b2b2c47fe7e2ee615a05df6 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler`. ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | """
https://en.wikipedia.org/wiki/Combination
"""
from math import factorial
def combinations(n: int, k: int) -> int:
"""
    Returns the number of different combinations of length k which can
be made from n values, where n >= k.
Examples:
>>> combinations(10,5)
252
>>> combinations(6,3)
20
>>> combinations(20,5)
15504
>>> combinations(52, 5)
2598960
>>> combinations(0, 0)
1
>>> combinations(-4, -5)
    Traceback (most recent call last):
    ...
ValueError: Please enter positive integers for n and k where n >= k
"""
    # If either of the conditions is true, the function is being asked
# to calculate a factorial of a negative number, which is not possible
if n < k or k < 0:
raise ValueError("Please enter positive integers for n and k where n >= k")
    return factorial(n) // (factorial(k) * factorial(n - k))
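# Note: on Python 3.8+ the standard library provides math.comb(n, k), which
# computes the same value directly; e.g. math.comb(52, 5) == 2598960.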
if __name__ == "__main__":
print(
"\nThe number of five-card hands possible from a standard",
f"fifty-two card deck is: {combinations(52, 5)}",
)
print(
"\nIf a class of 40 students must be arranged into groups of",
f"4 for group projects, there are {combinations(40, 4)} ways",
"to arrange them.\n",
)
print(
"If 10 teams are competing in a Formula One race, there",
f"are {combinations(10, 3)} ways that first, second and",
"third place can be awarded.\n",
)
| """
https://en.wikipedia.org/wiki/Combination
"""
from math import factorial
def combinations(n: int, k: int) -> int:
"""
Returns the number of different combinations of k length which can
be made from n values, where n >= k.
Examples:
>>> combinations(10,5)
252
>>> combinations(6,3)
20
>>> combinations(20,5)
15504
>>> combinations(52, 5)
2598960
>>> combinations(0, 0)
1
>>> combinations(-4, -5)
...
Traceback (most recent call last):
ValueError: Please enter positive integers for n and k where n >= k
"""
# If either of the conditions are true, the function is being asked
# to calculate a factorial of a negative number, which is not possible
if n < k or k < 0:
raise ValueError("Please enter positive integers for n and k where n >= k")
return int(factorial(n) / ((factorial(k)) * (factorial(n - k))))
if __name__ == "__main__":
print(
"\nThe number of five-card hands possible from a standard",
f"fifty-two card deck is: {combinations(52, 5)}",
)
print(
"\nIf a class of 40 students must be arranged into groups of",
f"4 for group projects, there are {combinations(40, 4)} ways",
"to arrange them.\n",
)
print(
"If 10 teams are competing in a Formula One race, there",
f"are {combinations(10, 3)} ways that first, second and",
"third place can be awarded.\n",
)
| -1 |
TheAlgorithms/Python | 6,258 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler` | ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | ngiachou | "2022-07-20T00:10:06Z" | "2022-09-14T08:40:04Z" | 81e30fd33c91bc37bc3baf54c42d1b192ecf41a6 | 2104fa7aebe8d76b2b2b2c47fe7e2ee615a05df6 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler`. ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
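For reference, the "O(sqrt(n)) algorithm" mentioned in the description above is plain trial division up to the square root of n. A minimal sketch of such a check follows (an illustration only, not necessarily the exact code in this PR's diff):
import math

def is_prime(number: int) -> bool:
    """Trial-division primality test in O(sqrt(n)) time (illustrative sketch)."""
    if number < 2:
        return False
    if number < 4:
        return True  # 2 and 3 are prime
    if number % 2 == 0 or number % 3 == 0:
        return False
    # Every prime greater than 3 has the form 6k +/- 1, so only those candidates are tried.
    for candidate in range(5, int(math.sqrt(number)) + 1, 6):
        if number % candidate == 0 or number % (candidate + 2) == 0:
            return False
    return True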
fixes #5434 | import re
def indian_phone_validator(phone: str) -> bool:
"""
Determine whether the string is a valid phone number or not
:param phone:
:return: Boolean
>>> indian_phone_validator("+91123456789")
False
>>> indian_phone_validator("+919876543210")
True
>>> indian_phone_validator("01234567896")
False
>>> indian_phone_validator("919876543218")
True
>>> indian_phone_validator("+91-1234567899")
False
>>> indian_phone_validator("+91-9876543218")
True
"""
pat = re.compile(r"^(\+91[\-\s]?)?[0]?(91)?[789]\d{9}$")
match = re.search(pat, phone)
if match:
return match.string == phone
return False
if __name__ == "__main__":
print(indian_phone_validator("+918827897895"))
| import re
def indian_phone_validator(phone: str) -> bool:
"""
Determine whether the string is a valid phone number or not
:param phone:
:return: Boolean
>>> indian_phone_validator("+91123456789")
False
>>> indian_phone_validator("+919876543210")
True
>>> indian_phone_validator("01234567896")
False
>>> indian_phone_validator("919876543218")
True
>>> indian_phone_validator("+91-1234567899")
False
>>> indian_phone_validator("+91-9876543218")
True
"""
pat = re.compile(r"^(\+91[\-\s]?)?[0]?(91)?[789]\d{9}$")
match = re.search(pat, phone)
if match:
return match.string == phone
return False
if __name__ == "__main__":
print(indian_phone_validator("+918827897895"))
| -1 |
TheAlgorithms/Python | 6,258 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler` | ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | ngiachou | "2022-07-20T00:10:06Z" | "2022-09-14T08:40:04Z" | 81e30fd33c91bc37bc3baf54c42d1b192ecf41a6 | 2104fa7aebe8d76b2b2b2c47fe7e2ee615a05df6 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler`. ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | """
Linear regression is the most basic type of regression commonly used for
predictive analysis. The idea is pretty simple: we have a dataset and we have
features associated with it. Features should be chosen very cautiously
as they determine how much our model will be able to make future predictions.
We try to set the weight of these features, over many iterations, so that they best
fit our dataset. This particular code uses a CSGO dataset (ADR vs
Rating). We try to fit a best-fit line through the dataset and estimate the parameters.
"""
import numpy as np
import requests
def collect_dataset():
"""Collect dataset of CSGO
The dataset contains ADR vs Rating of a Player
:return : dataset obtained from the link, as matrix
"""
response = requests.get(
"https://raw.githubusercontent.com/yashLadha/"
+ "The_Math_of_Intelligence/master/Week1/ADRvs"
+ "Rating.csv"
)
lines = response.text.splitlines()
data = []
for item in lines:
item = item.split(",")
data.append(item)
data.pop(0) # This is for removing the labels from the list
dataset = np.matrix(data)
return dataset
def run_steep_gradient_descent(data_x, data_y, len_data, alpha, theta):
"""Run steep gradient descent and updates the Feature vector accordingly_
:param data_x : contains the dataset
:param data_y : contains the output associated with each data-entry
:param len_data : length of the data_
:param alpha : Learning rate of the model
:param theta : Feature vector (weight's for our model)
;param return : Updated Feature's, using
curr_features - alpha_ * gradient(w.r.t. feature)
"""
n = len_data
prod = np.dot(theta, data_x.transpose())
prod -= data_y.transpose()
sum_grad = np.dot(prod, data_x)
theta = theta - (alpha / n) * sum_grad
return theta
def sum_of_square_error(data_x, data_y, len_data, theta):
"""Return sum of square error for error calculation
:param data_x : contains our dataset
:param data_y : contains the output (result vector)
:param len_data : len of the dataset
:param theta : contains the feature vector
:return : sum of square error computed from given feature's
"""
prod = np.dot(theta, data_x.transpose())
prod -= data_y.transpose()
sum_elem = np.sum(np.square(prod))
error = sum_elem / (2 * len_data)
return error
def run_linear_regression(data_x, data_y):
"""Implement Linear regression over the dataset
:param data_x : contains our dataset
:param data_y : contains the output (result vector)
:return : feature for line of best fit (Feature vector)
"""
iterations = 100000
alpha = 0.0001550
no_features = data_x.shape[1]
len_data = data_x.shape[0] - 1
theta = np.zeros((1, no_features))
for i in range(0, iterations):
theta = run_steep_gradient_descent(data_x, data_y, len_data, alpha, theta)
error = sum_of_square_error(data_x, data_y, len_data, theta)
print("At Iteration %d - Error is %.5f " % (i + 1, error))
return theta
def main():
"""Driver function"""
data = collect_dataset()
len_data = data.shape[0]
data_x = np.c_[np.ones(len_data), data[:, :-1]].astype(float)
data_y = data[:, -1].astype(float)
theta = run_linear_regression(data_x, data_y)
len_result = theta.shape[1]
print("Resultant Feature vector : ")
for i in range(0, len_result):
print(f"{theta[0, i]:.5f}")
if __name__ == "__main__":
main()
| """
Linear regression is the most basic type of regression commonly used for
predictive analysis. The idea is pretty simple: we have a dataset and we have
features associated with it. Features should be chosen very cautiously
as they determine how much our model will be able to make future predictions.
We try to set the weight of these features, over many iterations, so that they best
fit our dataset. This particular code uses a CSGO dataset (ADR vs
Rating). We try to fit a best-fit line through the dataset and estimate the parameters.
"""
import numpy as np
import requests
def collect_dataset():
"""Collect dataset of CSGO
The dataset contains ADR vs Rating of a Player
:return : dataset obtained from the link, as matrix
"""
response = requests.get(
"https://raw.githubusercontent.com/yashLadha/"
+ "The_Math_of_Intelligence/master/Week1/ADRvs"
+ "Rating.csv"
)
lines = response.text.splitlines()
data = []
for item in lines:
item = item.split(",")
data.append(item)
data.pop(0) # This is for removing the labels from the list
dataset = np.matrix(data)
return dataset
def run_steep_gradient_descent(data_x, data_y, len_data, alpha, theta):
"""Run steep gradient descent and updates the Feature vector accordingly_
:param data_x : contains the dataset
:param data_y : contains the output associated with each data-entry
:param len_data : length of the data_
:param alpha : Learning rate of the model
:param theta : Feature vector (weight's for our model)
;param return : Updated Feature's, using
curr_features - alpha_ * gradient(w.r.t. feature)
"""
n = len_data
prod = np.dot(theta, data_x.transpose())
prod -= data_y.transpose()
sum_grad = np.dot(prod, data_x)
theta = theta - (alpha / n) * sum_grad
return theta
def sum_of_square_error(data_x, data_y, len_data, theta):
"""Return sum of square error for error calculation
:param data_x : contains our dataset
:param data_y : contains the output (result vector)
:param len_data : len of the dataset
:param theta : contains the feature vector
:return : sum of square error computed from given feature's
"""
prod = np.dot(theta, data_x.transpose())
prod -= data_y.transpose()
sum_elem = np.sum(np.square(prod))
error = sum_elem / (2 * len_data)
return error
def run_linear_regression(data_x, data_y):
"""Implement Linear regression over the dataset
:param data_x : contains our dataset
:param data_y : contains the output (result vector)
:return : feature for line of best fit (Feature vector)
"""
iterations = 100000
alpha = 0.0001550
no_features = data_x.shape[1]
len_data = data_x.shape[0] - 1
theta = np.zeros((1, no_features))
for i in range(0, iterations):
theta = run_steep_gradient_descent(data_x, data_y, len_data, alpha, theta)
error = sum_of_square_error(data_x, data_y, len_data, theta)
print("At Iteration %d - Error is %.5f " % (i + 1, error))
return theta
def main():
"""Driver function"""
data = collect_dataset()
len_data = data.shape[0]
data_x = np.c_[np.ones(len_data), data[:, :-1]].astype(float)
data_y = data[:, -1].astype(float)
theta = run_linear_regression(data_x, data_y)
len_result = theta.shape[1]
print("Resultant Feature vector : ")
for i in range(0, len_result):
print(f"{theta[0, i]:.5f}")
if __name__ == "__main__":
main()
| -1 |
TheAlgorithms/Python | 6,258 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler` | ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | ngiachou | "2022-07-20T00:10:06Z" | "2022-09-14T08:40:04Z" | 81e30fd33c91bc37bc3baf54c42d1b192ecf41a6 | 2104fa7aebe8d76b2b2b2c47fe7e2ee615a05df6 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler`. ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | from datetime import datetime
import requests
from bs4 import BeautifulSoup
if __name__ == "__main__":
url = input("Enter image url: ").strip()
print(f"Downloading image from {url} ...")
soup = BeautifulSoup(requests.get(url).content, "html.parser")
# The image URL is in the content field of the first meta tag with property og:image
image_url = soup.find("meta", {"property": "og:image"})["content"]
image_data = requests.get(image_url).content
file_name = f"{datetime.now():%Y-%m-%d_%H:%M:%S}.jpg"
with open(file_name, "wb") as fp:
fp.write(image_data)
print(f"Done. Image saved to disk as {file_name}.")
| from datetime import datetime
import requests
from bs4 import BeautifulSoup
if __name__ == "__main__":
url = input("Enter image url: ").strip()
print(f"Downloading image from {url} ...")
soup = BeautifulSoup(requests.get(url).content, "html.parser")
# The image URL is in the content field of the first meta tag with property og:image
image_url = soup.find("meta", {"property": "og:image"})["content"]
image_data = requests.get(image_url).content
file_name = f"{datetime.now():%Y-%m-%d_%H:%M:%S}.jpg"
with open(file_name, "wb") as fp:
fp.write(image_data)
print(f"Done. Image saved to disk as {file_name}.")
| -1 |
TheAlgorithms/Python | 6,258 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler` | ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | ngiachou | "2022-07-20T00:10:06Z" | "2022-09-14T08:40:04Z" | 81e30fd33c91bc37bc3baf54c42d1b192ecf41a6 | 2104fa7aebe8d76b2b2b2c47fe7e2ee615a05df6 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler`. ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | # Python program for generating diamond pattern in Python 3.7+
# Function to print upper half of diamond (pyramid)
def floyd(n):
"""
Parameters:
n : size of pattern
"""
for i in range(0, n):
for j in range(0, n - i - 1): # printing spaces
print(" ", end="")
for k in range(0, i + 1): # printing stars
print("* ", end="")
print()
# Function to print lower half of diamond (pyramid)
def reverse_floyd(n):
"""
Parameters:
n : size of pattern
"""
for i in range(n, 0, -1):
for j in range(i, 0, -1): # printing stars
print("* ", end="")
print()
for k in range(n - i + 1, 0, -1): # printing spaces
print(" ", end="")
# Function to print complete diamond pattern of "*"
def pretty_print(n):
"""
Parameters:
n : size of pattern
"""
if n <= 0:
print(" ... .... nothing printing :(")
return
floyd(n) # upper half
reverse_floyd(n) # lower half
if __name__ == "__main__":
print(r"| /\ | |- | |- |--| |\ /| |-")
print(r"|/ \| |- |_ |_ |__| | \/ | |_")
K = 1
while K:
user_number = int(input("enter the number and , and see the magic : "))
print()
pretty_print(user_number)
K = int(input("press 0 to exit... and 1 to continue..."))
print("Good Bye...")
| # Python program for generating diamond pattern in Python 3.7+
# Function to print upper half of diamond (pyramid)
def floyd(n):
"""
Parameters:
n : size of pattern
"""
for i in range(0, n):
for j in range(0, n - i - 1): # printing spaces
print(" ", end="")
for k in range(0, i + 1): # printing stars
print("* ", end="")
print()
# Function to print lower half of diamond (pyramid)
def reverse_floyd(n):
"""
Parameters:
n : size of pattern
"""
for i in range(n, 0, -1):
for j in range(i, 0, -1): # printing stars
print("* ", end="")
print()
for k in range(n - i + 1, 0, -1): # printing spaces
print(" ", end="")
# Function to print complete diamond pattern of "*"
def pretty_print(n):
"""
Parameters:
n : size of pattern
"""
if n <= 0:
print(" ... .... nothing printing :(")
return
floyd(n) # upper half
reverse_floyd(n) # lower half
if __name__ == "__main__":
print(r"| /\ | |- | |- |--| |\ /| |-")
print(r"|/ \| |- |_ |_ |__| | \/ | |_")
K = 1
while K:
user_number = int(input("enter the number and , and see the magic : "))
print()
pretty_print(user_number)
K = int(input("press 0 to exit... and 1 to continue..."))
print("Good Bye...")
| -1 |
TheAlgorithms/Python | 6,258 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler` | ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | ngiachou | "2022-07-20T00:10:06Z" | "2022-09-14T08:40:04Z" | 81e30fd33c91bc37bc3baf54c42d1b192ecf41a6 | 2104fa7aebe8d76b2b2b2c47fe7e2ee615a05df6 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler`. ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | # Boolean Algebra
Boolean algebra is used to do arithmetic with bits of values True (1) or False (0).
There are three basic operations: 'and', 'or' and 'not'.
* <https://en.wikipedia.org/wiki/Boolean_algebra>
* <https://plato.stanford.edu/entries/boolalg-math/>
| # Boolean Algebra
Boolean algebra is used to do arithmetic with bits of values True (1) or False (0).
There are three basic operations: 'and', 'or' and 'not'.
* <https://en.wikipedia.org/wiki/Boolean_algebra>
* <https://plato.stanford.edu/entries/boolalg-math/>
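A quick illustration of the three operations in Python (a small example added here for clarity; it is not part of the original README):
a, b = True, False
print(a and b)  # False: 'and' is true only when both operands are true
print(a or b)   # True: 'or' is true when at least one operand is true
print(not a)    # False: 'not' negates its operand
print(int(a and b), int(a or b), int(not a))  # 0 1 0, the same results written as bits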
| -1 |
TheAlgorithms/Python | 6,258 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler` | ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | ngiachou | "2022-07-20T00:10:06Z" | "2022-09-14T08:40:04Z" | 81e30fd33c91bc37bc3baf54c42d1b192ecf41a6 | 2104fa7aebe8d76b2b2b2c47fe7e2ee615a05df6 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler`. ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | """
Project Euler Problem 5: https://projecteuler.net/problem=5
Smallest multiple
2520 is the smallest number that can be divided by each of the numbers
from 1 to 10 without any remainder.
What is the smallest positive number that is _evenly divisible_ by all
of the numbers from 1 to 20?
References:
- https://en.wiktionary.org/wiki/evenly_divisible
- https://en.wikipedia.org/wiki/Euclidean_algorithm
- https://en.wikipedia.org/wiki/Least_common_multiple
"""
def greatest_common_divisor(x: int, y: int) -> int:
"""
Euclidean Greatest Common Divisor algorithm
>>> greatest_common_divisor(0, 0)
0
>>> greatest_common_divisor(23, 42)
1
>>> greatest_common_divisor(15, 33)
3
>>> greatest_common_divisor(12345, 67890)
15
"""
return x if y == 0 else greatest_common_divisor(y, x % y)
def lcm(x: int, y: int) -> int:
"""
Least Common Multiple.
Using the property that lcm(a, b) * greatest_common_divisor(a, b) = a*b
>>> lcm(3, 15)
15
>>> lcm(1, 27)
27
>>> lcm(13, 27)
351
>>> lcm(64, 48)
192
"""
return (x * y) // greatest_common_divisor(x, y)
def solution(n: int = 20) -> int:
"""
Returns the smallest positive number that is evenly divisible (divisible
with no remainder) by all of the numbers from 1 to n.
>>> solution(10)
2520
>>> solution(15)
360360
>>> solution(22)
232792560
"""
g = 1
for i in range(1, n + 1):
g = lcm(g, i)
return g
if __name__ == "__main__":
print(f"{solution() = }")
| """
Project Euler Problem 5: https://projecteuler.net/problem=5
Smallest multiple
2520 is the smallest number that can be divided by each of the numbers
from 1 to 10 without any remainder.
What is the smallest positive number that is _evenly divisible_ by all
of the numbers from 1 to 20?
References:
- https://en.wiktionary.org/wiki/evenly_divisible
- https://en.wikipedia.org/wiki/Euclidean_algorithm
- https://en.wikipedia.org/wiki/Least_common_multiple
"""
def greatest_common_divisor(x: int, y: int) -> int:
"""
Euclidean Greatest Common Divisor algorithm
>>> greatest_common_divisor(0, 0)
0
>>> greatest_common_divisor(23, 42)
1
>>> greatest_common_divisor(15, 33)
3
>>> greatest_common_divisor(12345, 67890)
15
"""
return x if y == 0 else greatest_common_divisor(y, x % y)
def lcm(x: int, y: int) -> int:
"""
Least Common Multiple.
Using the property that lcm(a, b) * greatest_common_divisor(a, b) = a*b
>>> lcm(3, 15)
15
>>> lcm(1, 27)
27
>>> lcm(13, 27)
351
>>> lcm(64, 48)
192
"""
return (x * y) // greatest_common_divisor(x, y)
def solution(n: int = 20) -> int:
"""
Returns the smallest positive number that is evenly divisible (divisible
with no remainder) by all of the numbers from 1 to n.
>>> solution(10)
2520
>>> solution(15)
360360
>>> solution(22)
232792560
"""
g = 1
for i in range(1, n + 1):
g = lcm(g, i)
return g
if __name__ == "__main__":
print(f"{solution() = }")
| -1 |
TheAlgorithms/Python | 6,258 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler` | ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | ngiachou | "2022-07-20T00:10:06Z" | "2022-09-14T08:40:04Z" | 81e30fd33c91bc37bc3baf54c42d1b192ecf41a6 | 2104fa7aebe8d76b2b2b2c47fe7e2ee615a05df6 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler`. ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | #!/usr/bin/env python3
# This Python program implements an optimal binary search tree (abbreviated BST)
# building dynamic programming algorithm that delivers O(n^2) performance.
#
# The goal of the optimal BST problem is to build a low-cost BST for a
# given set of nodes, each with its own key and frequency. The frequency
# of the node is defined as how many times the node is searched.
# The search cost of a binary search tree is given by this formula:
#
# cost(1, n) = sum{i = 1 to n}((depth(node_i) + 1) * node_i_freq)
#
# where n is the number of nodes in the BST. The characteristic of low-cost
# BSTs is having a faster overall search time than other implementations.
# The reason for their fast search time is that the nodes with high
# frequencies will be placed near the root of the tree while the nodes
# with low frequencies will be placed near the leaves of the tree thus
# reducing search time in the most frequent instances.
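# For example, with three keys A < B < C and frequencies 1, 10 and 1, the
# optimal BST places B at the root (depth 0) and A, C at depth 1, giving
# cost = (0 + 1) * 10 + (1 + 1) * 1 + (1 + 1) * 1 = 14, which is the value the
# algorithm below would report. (Illustrative example, not part of the original file.)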
import sys
from random import randint
class Node:
"""Binary Search Tree Node"""
def __init__(self, key, freq):
self.key = key
self.freq = freq
def __str__(self):
"""
>>> str(Node(1, 2))
'Node(key=1, freq=2)'
"""
return f"Node(key={self.key}, freq={self.freq})"
def print_binary_search_tree(root, key, i, j, parent, is_left):
"""
Recursive function to print a BST from a root table.
>>> key = [3, 8, 9, 10, 17, 21]
>>> root = [[0, 1, 1, 1, 1, 1], [0, 1, 1, 1, 1, 3], [0, 0, 2, 3, 3, 3], \
[0, 0, 0, 3, 3, 3], [0, 0, 0, 0, 4, 5], [0, 0, 0, 0, 0, 5]]
>>> print_binary_search_tree(root, key, 0, 5, -1, False)
8 is the root of the binary search tree.
3 is the left child of key 8.
10 is the right child of key 8.
9 is the left child of key 10.
21 is the right child of key 10.
17 is the left child of key 21.
"""
if i > j or i < 0 or j > len(root) - 1:
return
node = root[i][j]
if parent == -1: # root does not have a parent
print(f"{key[node]} is the root of the binary search tree.")
elif is_left:
print(f"{key[node]} is the left child of key {parent}.")
else:
print(f"{key[node]} is the right child of key {parent}.")
print_binary_search_tree(root, key, i, node - 1, key[node], True)
print_binary_search_tree(root, key, node + 1, j, key[node], False)
def find_optimal_binary_search_tree(nodes):
"""
This function calculates and prints the optimal binary search tree.
The dynamic programming algorithm below runs in O(n^2) time.
Implemented from CLRS (Introduction to Algorithms) book.
https://en.wikipedia.org/wiki/Introduction_to_Algorithms
>>> find_optimal_binary_search_tree([Node(12, 8), Node(10, 34), Node(20, 50), \
Node(42, 3), Node(25, 40), Node(37, 30)])
Binary search tree nodes:
Node(key=10, freq=34)
Node(key=12, freq=8)
Node(key=20, freq=50)
Node(key=25, freq=40)
Node(key=37, freq=30)
Node(key=42, freq=3)
<BLANKLINE>
The cost of optimal BST for given tree nodes is 324.
20 is the root of the binary search tree.
10 is the left child of key 20.
12 is the right child of key 10.
25 is the right child of key 20.
37 is the right child of key 25.
42 is the right child of key 37.
"""
# Tree nodes must be sorted first, the code below sorts the keys in
# increasing order and rearrange its frequencies accordingly.
nodes.sort(key=lambda node: node.key)
n = len(nodes)
keys = [nodes[i].key for i in range(n)]
freqs = [nodes[i].freq for i in range(n)]
# This 2D array stores the overall tree cost (which is minimized as much as possible);
# for a single key, cost is equal to frequency of the key.
dp = [[freqs[i] if i == j else 0 for j in range(n)] for i in range(n)]
# sum[i][j] stores the sum of key frequencies between i and j inclusive in nodes
# array
sum = [[freqs[i] if i == j else 0 for j in range(n)] for i in range(n)]
# stores tree roots that will be used later for constructing binary search tree
root = [[i if i == j else 0 for j in range(n)] for i in range(n)]
for interval_length in range(2, n + 1):
for i in range(n - interval_length + 1):
j = i + interval_length - 1
dp[i][j] = sys.maxsize # set the value to "infinity"
sum[i][j] = sum[i][j - 1] + freqs[j]
# Apply Knuth's optimization
# Loop without optimization: for r in range(i, j + 1):
for r in range(root[i][j - 1], root[i + 1][j] + 1):  # r is a temporary root
left = dp[i][r - 1] if r != i else 0 # optimal cost for left subtree
right = dp[r + 1][j] if r != j else 0 # optimal cost for right subtree
cost = left + sum[i][j] + right
if dp[i][j] > cost:
dp[i][j] = cost
root[i][j] = r
print("Binary search tree nodes:")
for node in nodes:
print(node)
print(f"\nThe cost of optimal BST for given tree nodes is {dp[0][n - 1]}.")
print_binary_search_tree(root, keys, 0, n - 1, -1, False)
def main():
# A sample binary search tree
nodes = [Node(i, randint(1, 50)) for i in range(10, 0, -1)]
find_optimal_binary_search_tree(nodes)
if __name__ == "__main__":
main()
| #!/usr/bin/env python3
# This Python program implements an optimal binary search tree (abbreviated BST)
# building dynamic programming algorithm that delivers O(n^2) performance.
#
# The goal of the optimal BST problem is to build a low-cost BST for a
# given set of nodes, each with its own key and frequency. The frequency
# of the node is defined as how many times the node is searched.
# The search cost of a binary search tree is given by this formula:
#
# cost(1, n) = sum{i = 1 to n}((depth(node_i) + 1) * node_i_freq)
#
# where n is the number of nodes in the BST. The characteristic of low-cost
# BSTs is having a faster overall search time than other implementations.
# The reason for their fast search time is that the nodes with high
# frequencies will be placed near the root of the tree while the nodes
# with low frequencies will be placed near the leaves of the tree thus
# reducing search time in the most frequent instances.
import sys
from random import randint
class Node:
"""Binary Search Tree Node"""
def __init__(self, key, freq):
self.key = key
self.freq = freq
def __str__(self):
"""
>>> str(Node(1, 2))
'Node(key=1, freq=2)'
"""
return f"Node(key={self.key}, freq={self.freq})"
def print_binary_search_tree(root, key, i, j, parent, is_left):
"""
Recursive function to print a BST from a root table.
>>> key = [3, 8, 9, 10, 17, 21]
>>> root = [[0, 1, 1, 1, 1, 1], [0, 1, 1, 1, 1, 3], [0, 0, 2, 3, 3, 3], \
[0, 0, 0, 3, 3, 3], [0, 0, 0, 0, 4, 5], [0, 0, 0, 0, 0, 5]]
>>> print_binary_search_tree(root, key, 0, 5, -1, False)
8 is the root of the binary search tree.
3 is the left child of key 8.
10 is the right child of key 8.
9 is the left child of key 10.
21 is the right child of key 10.
17 is the left child of key 21.
"""
if i > j or i < 0 or j > len(root) - 1:
return
node = root[i][j]
if parent == -1: # root does not have a parent
print(f"{key[node]} is the root of the binary search tree.")
elif is_left:
print(f"{key[node]} is the left child of key {parent}.")
else:
print(f"{key[node]} is the right child of key {parent}.")
print_binary_search_tree(root, key, i, node - 1, key[node], True)
print_binary_search_tree(root, key, node + 1, j, key[node], False)
def find_optimal_binary_search_tree(nodes):
"""
This function calculates and prints the optimal binary search tree.
The dynamic programming algorithm below runs in O(n^2) time.
Implemented from CLRS (Introduction to Algorithms) book.
https://en.wikipedia.org/wiki/Introduction_to_Algorithms
>>> find_optimal_binary_search_tree([Node(12, 8), Node(10, 34), Node(20, 50), \
Node(42, 3), Node(25, 40), Node(37, 30)])
Binary search tree nodes:
Node(key=10, freq=34)
Node(key=12, freq=8)
Node(key=20, freq=50)
Node(key=25, freq=40)
Node(key=37, freq=30)
Node(key=42, freq=3)
<BLANKLINE>
The cost of optimal BST for given tree nodes is 324.
20 is the root of the binary search tree.
10 is the left child of key 20.
12 is the right child of key 10.
25 is the right child of key 20.
37 is the right child of key 25.
42 is the right child of key 37.
"""
# Tree nodes must be sorted first, the code below sorts the keys in
# increasing order and rearrange its frequencies accordingly.
nodes.sort(key=lambda node: node.key)
n = len(nodes)
keys = [nodes[i].key for i in range(n)]
freqs = [nodes[i].freq for i in range(n)]
# This 2D array stores the overall tree cost (which is minimized as much as possible);
# for a single key, cost is equal to frequency of the key.
dp = [[freqs[i] if i == j else 0 for j in range(n)] for i in range(n)]
# sum[i][j] stores the sum of key frequencies between i and j inclusive in nodes
# array
sum = [[freqs[i] if i == j else 0 for j in range(n)] for i in range(n)]
# stores tree roots that will be used later for constructing binary search tree
root = [[i if i == j else 0 for j in range(n)] for i in range(n)]
for interval_length in range(2, n + 1):
for i in range(n - interval_length + 1):
j = i + interval_length - 1
dp[i][j] = sys.maxsize # set the value to "infinity"
sum[i][j] = sum[i][j - 1] + freqs[j]
# Apply Knuth's optimization
# Loop without optimization: for r in range(i, j + 1):
for r in range(root[i][j - 1], root[i + 1][j] + 1):  # r is a temporary root
left = dp[i][r - 1] if r != i else 0 # optimal cost for left subtree
right = dp[r + 1][j] if r != j else 0 # optimal cost for right subtree
cost = left + sum[i][j] + right
if dp[i][j] > cost:
dp[i][j] = cost
root[i][j] = r
print("Binary search tree nodes:")
for node in nodes:
print(node)
print(f"\nThe cost of optimal BST for given tree nodes is {dp[0][n - 1]}.")
print_binary_search_tree(root, keys, 0, n - 1, -1, False)
def main():
# A sample binary search tree
nodes = [Node(i, randint(1, 50)) for i in range(10, 0, -1)]
find_optimal_binary_search_tree(nodes)
if __name__ == "__main__":
main()
| -1 |
TheAlgorithms/Python | 6,258 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler` | ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | ngiachou | "2022-07-20T00:10:06Z" | "2022-09-14T08:40:04Z" | 81e30fd33c91bc37bc3baf54c42d1b192ecf41a6 | 2104fa7aebe8d76b2b2b2c47fe7e2ee615a05df6 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler`. ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | """
A wavelet tree is a data structure designed to efficiently answer various range queries
for arrays. Wavelet trees are different from other binary trees in the sense that
the nodes are split based on the actual values of the elements and not on indices,
as is the case with segment trees or Fenwick trees. You can read more about them here:
1. https://users.dcc.uchile.cl/~jperez/papers/ioiconf16.pdf
2. https://www.youtube.com/watch?v=4aSv9PcecDw&t=811s
3. https://www.youtube.com/watch?v=CybAgVF-MMc&t=1178s
"""
from __future__ import annotations
test_array = [2, 1, 4, 5, 6, 0, 8, 9, 1, 2, 0, 6, 4, 2, 0, 6, 5, 3, 2, 7]
class Node:
def __init__(self, length: int) -> None:
self.minn: int = -1
self.maxx: int = -1
self.map_left: list[int] = [-1] * length
self.left: Node | None = None
self.right: Node | None = None
def __repr__(self) -> str:
"""
>>> node = Node(length=27)
>>> repr(node)
'min_value: -1, max_value: -1'
>>> repr(node) == str(node)
True
"""
return f"min_value: {self.minn}, max_value: {self.maxx}"
def build_tree(arr: list[int]) -> Node | None:
"""
Builds the tree for arr and returns the root
of the constructed tree
>>> build_tree(test_array)
min_value: 0, max_value: 9
"""
root = Node(len(arr))
root.minn, root.maxx = min(arr), max(arr)
# Leaf node case where the node contains only one unique value
if root.minn == root.maxx:
return root
"""
Take the mean of min and max element of arr as the pivot and
partition arr into left_arr and right_arr with all elements <= pivot in the
left_arr and the rest in right_arr, maintaining the order of the elements,
then recursively build trees for left_arr and right_arr
"""
pivot = (root.minn + root.maxx) // 2
left_arr: list[int] = []
right_arr: list[int] = []
for index, num in enumerate(arr):
if num <= pivot:
left_arr.append(num)
else:
right_arr.append(num)
root.map_left[index] = len(left_arr)
root.left = build_tree(left_arr)
root.right = build_tree(right_arr)
return root
def rank_till_index(node: Node | None, num: int, index: int) -> int:
"""
Returns the number of occurrences of num in interval [0, index] in the list
>>> root = build_tree(test_array)
>>> rank_till_index(root, 6, 6)
1
>>> rank_till_index(root, 2, 0)
1
>>> rank_till_index(root, 1, 10)
2
>>> rank_till_index(root, 17, 7)
0
>>> rank_till_index(root, 0, 9)
1
"""
if index < 0 or node is None:
return 0
# Leaf node cases
if node.minn == node.maxx:
return index + 1 if node.minn == num else 0
pivot = (node.minn + node.maxx) // 2
if num <= pivot:
# go the left subtree and map index to the left subtree
return rank_till_index(node.left, num, node.map_left[index] - 1)
else:
# go to the right subtree and map index to the right subtree
return rank_till_index(node.right, num, index - node.map_left[index])
def rank(node: Node | None, num: int, start: int, end: int) -> int:
"""
Returns the number of occurrences of num in interval [start, end] in the list
>>> root = build_tree(test_array)
>>> rank(root, 6, 3, 13)
2
>>> rank(root, 2, 0, 19)
4
>>> rank(root, 9, 2 ,2)
0
>>> rank(root, 0, 5, 10)
2
"""
if start > end:
return 0
rank_till_end = rank_till_index(node, num, end)
rank_before_start = rank_till_index(node, num, start - 1)
return rank_till_end - rank_before_start
def quantile(node: Node | None, index: int, start: int, end: int) -> int:
"""
Returns the index'th smallest element in interval [start, end] in the list
index is 0-indexed
>>> root = build_tree(test_array)
>>> quantile(root, 2, 2, 5)
5
>>> quantile(root, 5, 2, 13)
4
>>> quantile(root, 0, 6, 6)
8
>>> quantile(root, 4, 2, 5)
-1
"""
if index > (end - start) or start > end or node is None:
return -1
# Leaf node case
if node.minn == node.maxx:
return node.minn
# Number of elements in the left subtree in interval [start, end]
num_elements_in_left_tree = node.map_left[end] - (
node.map_left[start - 1] if start else 0
)
if num_elements_in_left_tree > index:
return quantile(
node.left,
index,
(node.map_left[start - 1] if start else 0),
node.map_left[end] - 1,
)
else:
return quantile(
node.right,
index - num_elements_in_left_tree,
start - (node.map_left[start - 1] if start else 0),
end - node.map_left[end],
)
def range_counting(
node: Node | None, start: int, end: int, start_num: int, end_num: int
) -> int:
"""
Returns the number of elements in range [start_num, end_num]
in interval [start, end] in the list
>>> root = build_tree(test_array)
>>> range_counting(root, 1, 10, 3, 7)
3
>>> range_counting(root, 2, 2, 1, 4)
1
>>> range_counting(root, 0, 19, 0, 100)
20
>>> range_counting(root, 1, 0, 1, 100)
0
>>> range_counting(root, 0, 17, 100, 1)
0
"""
if (
start > end
or node is None
or start_num > end_num
or node.minn > end_num
or node.maxx < start_num
):
return 0
if start_num <= node.minn and node.maxx <= end_num:
return end - start + 1
left = range_counting(
node.left,
(node.map_left[start - 1] if start else 0),
node.map_left[end] - 1,
start_num,
end_num,
)
right = range_counting(
node.right,
start - (node.map_left[start - 1] if start else 0),
end - node.map_left[end],
start_num,
end_num,
)
return left + right
if __name__ == "__main__":
import doctest
doctest.testmod()
| """
A wavelet tree is a data structure designed to efficiently answer various range queries
for arrays. Wavelet trees are different from other binary trees in the sense that
the nodes are split based on the actual values of the elements and not on indices,
as is the case with segment trees or Fenwick trees. You can read more about them here:
1. https://users.dcc.uchile.cl/~jperez/papers/ioiconf16.pdf
2. https://www.youtube.com/watch?v=4aSv9PcecDw&t=811s
3. https://www.youtube.com/watch?v=CybAgVF-MMc&t=1178s
"""
from __future__ import annotations
test_array = [2, 1, 4, 5, 6, 0, 8, 9, 1, 2, 0, 6, 4, 2, 0, 6, 5, 3, 2, 7]
class Node:
def __init__(self, length: int) -> None:
self.minn: int = -1
self.maxx: int = -1
self.map_left: list[int] = [-1] * length
self.left: Node | None = None
self.right: Node | None = None
def __repr__(self) -> str:
"""
>>> node = Node(length=27)
>>> repr(node)
'min_value: -1, max_value: -1'
>>> repr(node) == str(node)
True
"""
return f"min_value: {self.minn}, max_value: {self.maxx}"
def build_tree(arr: list[int]) -> Node | None:
"""
Builds the tree for arr and returns the root
of the constructed tree
>>> build_tree(test_array)
min_value: 0, max_value: 9
"""
root = Node(len(arr))
root.minn, root.maxx = min(arr), max(arr)
# Leaf node case where the node contains only one unique value
if root.minn == root.maxx:
return root
"""
Take the mean of min and max element of arr as the pivot and
partition arr into left_arr and right_arr with all elements <= pivot in the
left_arr and the rest in right_arr, maintaining the order of the elements,
then recursively build trees for left_arr and right_arr
"""
pivot = (root.minn + root.maxx) // 2
left_arr: list[int] = []
right_arr: list[int] = []
for index, num in enumerate(arr):
if num <= pivot:
left_arr.append(num)
else:
right_arr.append(num)
root.map_left[index] = len(left_arr)
root.left = build_tree(left_arr)
root.right = build_tree(right_arr)
return root
def rank_till_index(node: Node | None, num: int, index: int) -> int:
"""
Returns the number of occurrences of num in interval [0, index] in the list
>>> root = build_tree(test_array)
>>> rank_till_index(root, 6, 6)
1
>>> rank_till_index(root, 2, 0)
1
>>> rank_till_index(root, 1, 10)
2
>>> rank_till_index(root, 17, 7)
0
>>> rank_till_index(root, 0, 9)
1
"""
if index < 0 or node is None:
return 0
# Leaf node cases
if node.minn == node.maxx:
return index + 1 if node.minn == num else 0
pivot = (node.minn + node.maxx) // 2
if num <= pivot:
# go to the left subtree and map index to the left subtree
return rank_till_index(node.left, num, node.map_left[index] - 1)
else:
# go to the right subtree and map index to the right subtree
return rank_till_index(node.right, num, index - node.map_left[index])
def rank(node: Node | None, num: int, start: int, end: int) -> int:
"""
Returns the number of occurrences of num in interval [start, end] in the list
>>> root = build_tree(test_array)
>>> rank(root, 6, 3, 13)
2
>>> rank(root, 2, 0, 19)
4
>>> rank(root, 9, 2 ,2)
0
>>> rank(root, 0, 5, 10)
2
"""
if start > end:
return 0
rank_till_end = rank_till_index(node, num, end)
rank_before_start = rank_till_index(node, num, start - 1)
return rank_till_end - rank_before_start
def quantile(node: Node | None, index: int, start: int, end: int) -> int:
"""
Returns the index'th smallest element in interval [start, end] in the list
index is 0-indexed
>>> root = build_tree(test_array)
>>> quantile(root, 2, 2, 5)
5
>>> quantile(root, 5, 2, 13)
4
>>> quantile(root, 0, 6, 6)
8
>>> quantile(root, 4, 2, 5)
-1
"""
if index > (end - start) or start > end or node is None:
return -1
# Leaf node case
if node.minn == node.maxx:
return node.minn
# Number of elements in the left subtree in interval [start, end]
num_elements_in_left_tree = node.map_left[end] - (
node.map_left[start - 1] if start else 0
)
if num_elements_in_left_tree > index:
return quantile(
node.left,
index,
(node.map_left[start - 1] if start else 0),
node.map_left[end] - 1,
)
else:
return quantile(
node.right,
index - num_elements_in_left_tree,
start - (node.map_left[start - 1] if start else 0),
end - node.map_left[end],
)
def range_counting(
node: Node | None, start: int, end: int, start_num: int, end_num: int
) -> int:
"""
Returns the number of elements in range [start_num, end_num]
in interval [start, end] in the list
>>> root = build_tree(test_array)
>>> range_counting(root, 1, 10, 3, 7)
3
>>> range_counting(root, 2, 2, 1, 4)
1
>>> range_counting(root, 0, 19, 0, 100)
20
>>> range_counting(root, 1, 0, 1, 100)
0
>>> range_counting(root, 0, 17, 100, 1)
0
"""
if (
start > end
or node is None
or start_num > end_num
or node.minn > end_num
or node.maxx < start_num
):
return 0
if start_num <= node.minn and node.maxx <= end_num:
return end - start + 1
left = range_counting(
node.left,
(node.map_left[start - 1] if start else 0),
node.map_left[end] - 1,
start_num,
end_num,
)
right = range_counting(
node.right,
start - (node.map_left[start - 1] if start else 0),
end - node.map_left[end],
start_num,
end_num,
)
return left + right
if __name__ == "__main__":
import doctest
doctest.testmod()
| -1 |
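As a quick sanity check on the wavelet-tree queries above, here is a small brute-force comparison sketch; it is not part of the original module and assumes build_tree, rank, quantile, range_counting and test_array from the file above are in scope.

# Sketch only: brute-force cross-checks for the wavelet tree queries above.
# Assumes build_tree, rank, quantile, range_counting and test_array are in scope.
def brute_rank(arr, num, start, end):
    return sum(1 for value in arr[start : end + 1] if value == num)

def brute_quantile(arr, index, start, end):
    window = sorted(arr[start : end + 1])
    return window[index] if index < len(window) else -1

root = build_tree(test_array)
assert rank(root, 6, 3, 13) == brute_rank(test_array, 6, 3, 13)
assert quantile(root, 5, 2, 13) == brute_quantile(test_array, 5, 2, 13)
assert range_counting(root, 1, 10, 3, 7) == sum(
    1 for value in test_array[1:11] if 3 <= value <= 7
)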
TheAlgorithms/Python | 6,258 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler` | ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | ngiachou | "2022-07-20T00:10:06Z" | "2022-09-14T08:40:04Z" | 81e30fd33c91bc37bc3baf54c42d1b192ecf41a6 | 2104fa7aebe8d76b2b2b2c47fe7e2ee615a05df6 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler`. ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | def get_word_pattern(word: str) -> str:
"""
>>> get_word_pattern("pattern")
'0.1.2.2.3.4.5'
>>> get_word_pattern("word pattern")
'0.1.2.3.4.5.6.7.7.8.2.9'
>>> get_word_pattern("get word pattern")
'0.1.2.3.4.5.6.7.3.8.9.2.2.1.6.10'
"""
word = word.upper()
next_num = 0
letter_nums = {}
word_pattern = []
for letter in word:
if letter not in letter_nums:
letter_nums[letter] = str(next_num)
next_num += 1
word_pattern.append(letter_nums[letter])
return ".".join(word_pattern)
if __name__ == "__main__":
import pprint
import time
start_time = time.time()
with open("dictionary.txt") as in_file:
wordList = in_file.read().splitlines()
all_patterns: dict = {}
for word in wordList:
pattern = get_word_pattern(word)
if pattern in all_patterns:
all_patterns[pattern].append(word)
else:
all_patterns[pattern] = [word]
with open("word_patterns.txt", "w") as out_file:
out_file.write(pprint.pformat(all_patterns))
totalTime = round(time.time() - start_time, 2)
print(f"Done! {len(all_patterns):,} word patterns found in {totalTime} seconds.")
# Done! 9,581 word patterns found in 0.58 seconds.
| def get_word_pattern(word: str) -> str:
"""
>>> get_word_pattern("pattern")
'0.1.2.2.3.4.5'
>>> get_word_pattern("word pattern")
'0.1.2.3.4.5.6.7.7.8.2.9'
>>> get_word_pattern("get word pattern")
'0.1.2.3.4.5.6.7.3.8.9.2.2.1.6.10'
"""
word = word.upper()
next_num = 0
letter_nums = {}
word_pattern = []
for letter in word:
if letter not in letter_nums:
letter_nums[letter] = str(next_num)
next_num += 1
word_pattern.append(letter_nums[letter])
return ".".join(word_pattern)
if __name__ == "__main__":
import pprint
import time
start_time = time.time()
with open("dictionary.txt") as in_file:
wordList = in_file.read().splitlines()
all_patterns: dict = {}
for word in wordList:
pattern = get_word_pattern(word)
if pattern in all_patterns:
all_patterns[pattern].append(word)
else:
all_patterns[pattern] = [word]
with open("word_patterns.txt", "w") as out_file:
out_file.write(pprint.pformat(all_patterns))
totalTime = round(time.time() - start_time, 2)
print(f"Done! {len(all_patterns):,} word patterns found in {totalTime} seconds.")
# Done! 9,581 word patterns found in 0.58 seconds.
| -1 |
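The same pattern-grouping idea can be sketched in memory without the dictionary.txt dependency; the helper below mirrors get_word_pattern and the short word list is made up for illustration.

# Sketch only: group a small, made-up word list by letter pattern in memory.
def word_pattern(word: str) -> str:
    letter_nums: dict[str, str] = {}
    return ".".join(
        letter_nums.setdefault(letter, str(len(letter_nums))) for letter in word.upper()
    )

words = ["apple", "otter", "hello", "puppy", "geese"]
groups: dict[str, list[str]] = {}
for candidate in words:
    groups.setdefault(word_pattern(candidate), []).append(candidate)
print(groups)  # "apple" and "otter" share the pattern "0.1.1.2.3"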
TheAlgorithms/Python | 6,258 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler` | ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | ngiachou | "2022-07-20T00:10:06Z" | "2022-09-14T08:40:04Z" | 81e30fd33c91bc37bc3baf54c42d1b192ecf41a6 | 2104fa7aebe8d76b2b2b2c47fe7e2ee615a05df6 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler`. ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | def get_1s_count(number: int) -> int:
"""
Count the number of set bits in a 32 bit integer using Brian Kernighan's way.
Ref - http://graphics.stanford.edu/~seander/bithacks.html#CountBitsSetKernighan
>>> get_1s_count(25)
3
>>> get_1s_count(37)
3
>>> get_1s_count(21)
3
>>> get_1s_count(58)
4
>>> get_1s_count(0)
0
>>> get_1s_count(256)
1
>>> get_1s_count(-1)
Traceback (most recent call last):
...
ValueError: the value of input must be positive
>>> get_1s_count(0.8)
Traceback (most recent call last):
...
TypeError: Input value must be an 'int' type
"""
if number < 0:
raise ValueError("the value of input must be positive")
elif isinstance(number, float):
raise TypeError("Input value must be an 'int' type")
count = 0
while number:
# This way we arrive at next set bit (next 1) instead of looping
# through each bit and checking for 1s hence the
# loop won't run 32 times it will only run the number of `1` times
number &= number - 1
count += 1
return count
if __name__ == "__main__":
import doctest
doctest.testmod()
| def get_1s_count(number: int) -> int:
"""
Count the number of set bits in a 32 bit integer using Brian Kernighan's way.
Ref - http://graphics.stanford.edu/~seander/bithacks.html#CountBitsSetKernighan
>>> get_1s_count(25)
3
>>> get_1s_count(37)
3
>>> get_1s_count(21)
3
>>> get_1s_count(58)
4
>>> get_1s_count(0)
0
>>> get_1s_count(256)
1
>>> get_1s_count(-1)
Traceback (most recent call last):
...
ValueError: the value of input must be positive
>>> get_1s_count(0.8)
Traceback (most recent call last):
...
TypeError: Input value must be an 'int' type
"""
if number < 0:
raise ValueError("the value of input must be positive")
elif isinstance(number, float):
raise TypeError("Input value must be an 'int' type")
count = 0
while number:
# This way we arrive at next set bit (next 1) instead of looping
# through each bit and checking for 1s hence the
# loop won't run 32 times it will only run the number of `1` times
number &= number - 1
count += 1
return count
if __name__ == "__main__":
import doctest
doctest.testmod()
| -1 |
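To see why the n & (n - 1) step works, the short sketch below prints how each iteration clears the lowest set bit; comparing against bin(number).count("1") is an easy independent check.

# Sketch only: each n & (n - 1) step clears the lowest set bit of n.
n = 0b101100  # 44, which has three set bits
steps = 0
while n:
    print(f"{n:06b} & {n - 1:06b} -> {n & (n - 1):06b}")
    n &= n - 1
    steps += 1
print(steps, "set bits")  # 3
print(bin(44).count("1"))  # 3, an independent cross-check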
TheAlgorithms/Python | 6,258 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler` | ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | ngiachou | "2022-07-20T00:10:06Z" | "2022-09-14T08:40:04Z" | 81e30fd33c91bc37bc3baf54c42d1b192ecf41a6 | 2104fa7aebe8d76b2b2b2c47fe7e2ee615a05df6 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler`. ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | """
Implementation of finding nth fibonacci number using matrix exponentiation.
Time Complexity is about O(log(n)*8), where 8 is the complexity of matrix
multiplication of size 2 by 2.
And on the other hand complexity of bruteforce solution is O(n).
As we know
f[n] = f[n-1] + f[n-2]
Converting to matrix,
[f(n),f(n-1)] = [[1,1],[1,0]] * [f(n-1),f(n-2)]
-> [f(n),f(n-1)] = [[1,1],[1,0]]^2 * [f(n-2),f(n-3)]
...
...
-> [f(n),f(n-1)] = [[1,1],[1,0]]^(n-1) * [f(1),f(0)]
So we just need the n-fold multiplication of the matrix [[1,1],[1,0]].
We can reduce the number of multiplications by following the divide and conquer approach.
"""
def multiply(matrix_a, matrix_b):
matrix_c = []
n = len(matrix_a)
for i in range(n):
list_1 = []
for j in range(n):
val = 0
for k in range(n):
val = val + matrix_a[i][k] * matrix_b[k][j]
list_1.append(val)
matrix_c.append(list_1)
return matrix_c
def identity(n):
return [[int(row == column) for column in range(n)] for row in range(n)]
def nth_fibonacci_matrix(n):
"""
>>> nth_fibonacci_matrix(100)
354224848179261915075
>>> nth_fibonacci_matrix(-100)
-100
"""
if n <= 1:
return n
res_matrix = identity(2)
fibonacci_matrix = [[1, 1], [1, 0]]
n = n - 1
while n > 0:
if n % 2 == 1:
res_matrix = multiply(res_matrix, fibonacci_matrix)
fibonacci_matrix = multiply(fibonacci_matrix, fibonacci_matrix)
n = int(n / 2)
return res_matrix[0][0]
def nth_fibonacci_bruteforce(n):
"""
>>> nth_fibonacci_bruteforce(100)
354224848179261915075
>>> nth_fibonacci_bruteforce(-100)
-100
"""
if n <= 1:
return n
fib0 = 0
fib1 = 1
for i in range(2, n + 1):
fib0, fib1 = fib1, fib0 + fib1
return fib1
def main():
for ordinal in "0th 1st 2nd 3rd 10th 100th 1000th".split():
n = int("".join(c for c in ordinal if c in "0123456789")) # 1000th --> 1000
print(
f"{ordinal} fibonacci number using matrix exponentiation is "
f"{nth_fibonacci_matrix(n)} and using bruteforce is "
f"{nth_fibonacci_bruteforce(n)}\n"
)
# from timeit import timeit
# print(timeit("nth_fibonacci_matrix(1000000)",
# "from main import nth_fibonacci_matrix", number=5))
# print(timeit("nth_fibonacci_bruteforce(1000000)",
# "from main import nth_fibonacci_bruteforce", number=5))
# 2.3342058970001744
# 57.256506615000035
if __name__ == "__main__":
import doctest
doctest.testmod()
main()
| """
Implementation of finding nth fibonacci number using matrix exponentiation.
Time Complexity is about O(log(n)*8), where 8 is the complexity of matrix
multiplication of size 2 by 2.
And on the other hand complexity of bruteforce solution is O(n).
As we know
f[n] = f[n-1] + f[n-2]
Converting to matrix,
[f(n),f(n-1)] = [[1,1],[1,0]] * [f(n-1),f(n-2)]
-> [f(n),f(n-1)] = [[1,1],[1,0]]^2 * [f(n-2),f(n-3)]
...
...
-> [f(n),f(n-1)] = [[1,1],[1,0]]^(n-1) * [f(1),f(0)]
So we just need the n-fold multiplication of the matrix [[1,1],[1,0]].
We can reduce the number of multiplications by following the divide and conquer approach.
"""
def multiply(matrix_a, matrix_b):
matrix_c = []
n = len(matrix_a)
for i in range(n):
list_1 = []
for j in range(n):
val = 0
for k in range(n):
val = val + matrix_a[i][k] * matrix_b[k][j]
list_1.append(val)
matrix_c.append(list_1)
return matrix_c
def identity(n):
return [[int(row == column) for column in range(n)] for row in range(n)]
def nth_fibonacci_matrix(n):
"""
>>> nth_fibonacci_matrix(100)
354224848179261915075
>>> nth_fibonacci_matrix(-100)
-100
"""
if n <= 1:
return n
res_matrix = identity(2)
fibonacci_matrix = [[1, 1], [1, 0]]
n = n - 1
while n > 0:
if n % 2 == 1:
res_matrix = multiply(res_matrix, fibonacci_matrix)
fibonacci_matrix = multiply(fibonacci_matrix, fibonacci_matrix)
n = int(n / 2)
return res_matrix[0][0]
def nth_fibonacci_bruteforce(n):
"""
>>> nth_fibonacci_bruteforce(100)
354224848179261915075
>>> nth_fibonacci_bruteforce(-100)
-100
"""
if n <= 1:
return n
fib0 = 0
fib1 = 1
for i in range(2, n + 1):
fib0, fib1 = fib1, fib0 + fib1
return fib1
def main():
for ordinal in "0th 1st 2nd 3rd 10th 100th 1000th".split():
n = int("".join(c for c in ordinal if c in "0123456789")) # 1000th --> 1000
print(
f"{ordinal} fibonacci number using matrix exponentiation is "
f"{nth_fibonacci_matrix(n)} and using bruteforce is "
f"{nth_fibonacci_bruteforce(n)}\n"
)
# from timeit import timeit
# print(timeit("nth_fibonacci_matrix(1000000)",
# "from main import nth_fibonacci_matrix", number=5))
# print(timeit("nth_fibonacci_bruteforce(1000000)",
# "from main import nth_fibonacci_bruteforce", number=5))
# 2.3342058970001744
# 57.256506615000035
if __name__ == "__main__":
import doctest
doctest.testmod()
main()
| -1 |
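The identity behind the matrix method is [[1,1],[1,0]]^n = [[F(n+1),F(n)],[F(n),F(n-1)]]; the standalone sketch below (not the file's own helpers) verifies it for small n using binary exponentiation.

# Sketch only: verify [[1, 1], [1, 0]]**n = [[F(n+1), F(n)], [F(n), F(n-1)]] for small n.
def mat_mult(a, b):
    return [
        [a[0][0] * b[0][0] + a[0][1] * b[1][0], a[0][0] * b[0][1] + a[0][1] * b[1][1]],
        [a[1][0] * b[0][0] + a[1][1] * b[1][0], a[1][0] * b[0][1] + a[1][1] * b[1][1]],
    ]

def mat_pow(matrix, n):
    result = [[1, 0], [0, 1]]  # 2x2 identity
    while n:
        if n & 1:
            result = mat_mult(result, matrix)
        matrix = mat_mult(matrix, matrix)
        n >>= 1
    return result

fibs = [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
for k in range(1, 9):
    assert mat_pow([[1, 1], [1, 0]], k) == [[fibs[k + 1], fibs[k]], [fibs[k], fibs[k - 1]]]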
TheAlgorithms/Python | 6,258 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler` | ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | ngiachou | "2022-07-20T00:10:06Z" | "2022-09-14T08:40:04Z" | 81e30fd33c91bc37bc3baf54c42d1b192ecf41a6 | 2104fa7aebe8d76b2b2b2c47fe7e2ee615a05df6 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler`. ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | from __future__ import annotations
def stable_matching(
donor_pref: list[list[int]], recipient_pref: list[list[int]]
) -> list[int]:
"""
Finds the stable match in any bipartite graph, i.e. a pairing where no 2 objects
prefer each other over their partner. The function accepts the preferences of
organ donors and recipients (where both are assigned numbers from 0 to n-1) and
returns a list where the index position corresponds to the donor and value at the
index is the organ recipient.
To better understand the algorithm, see also:
https://github.com/akashvshroff/Gale_Shapley_Stable_Matching (README).
https://www.youtube.com/watch?v=Qcv1IqHWAzg&t=13s (Numberphile YouTube).
>>> donor_pref = [[0, 1, 3, 2], [0, 2, 3, 1], [1, 0, 2, 3], [0, 3, 1, 2]]
>>> recipient_pref = [[3, 1, 2, 0], [3, 1, 0, 2], [0, 3, 1, 2], [1, 0, 3, 2]]
>>> print(stable_matching(donor_pref, recipient_pref))
[1, 2, 3, 0]
"""
assert len(donor_pref) == len(recipient_pref)
n = len(donor_pref)
unmatched_donors = list(range(n))
donor_record = [-1] * n # who the donor has donated to
rec_record = [-1] * n # who the recipient has received from
num_donations = [0] * n
while unmatched_donors:
donor = unmatched_donors[0]
donor_preference = donor_pref[donor]
recipient = donor_preference[num_donations[donor]]
num_donations[donor] += 1
rec_preference = recipient_pref[recipient]
prev_donor = rec_record[recipient]
if prev_donor != -1:
if rec_preference.index(prev_donor) > rec_preference.index(donor):
rec_record[recipient] = donor
donor_record[donor] = recipient
unmatched_donors.append(prev_donor)
unmatched_donors.remove(donor)
else:
rec_record[recipient] = donor
donor_record[donor] = recipient
unmatched_donors.remove(donor)
return donor_record
| from __future__ import annotations
def stable_matching(
donor_pref: list[list[int]], recipient_pref: list[list[int]]
) -> list[int]:
"""
Finds the stable match in any bipartite graph, i.e. a pairing where no 2 objects
prefer each other over their partner. The function accepts the preferences of
organ donors and recipients (where both are assigned numbers from 0 to n-1) and
returns a list where the index position corresponds to the donor and value at the
index is the organ recipient.
To better understand the algorithm, see also:
https://github.com/akashvshroff/Gale_Shapley_Stable_Matching (README).
https://www.youtube.com/watch?v=Qcv1IqHWAzg&t=13s (Numberphile YouTube).
>>> donor_pref = [[0, 1, 3, 2], [0, 2, 3, 1], [1, 0, 2, 3], [0, 3, 1, 2]]
>>> recipient_pref = [[3, 1, 2, 0], [3, 1, 0, 2], [0, 3, 1, 2], [1, 0, 3, 2]]
>>> print(stable_matching(donor_pref, recipient_pref))
[1, 2, 3, 0]
"""
assert len(donor_pref) == len(recipient_pref)
n = len(donor_pref)
unmatched_donors = list(range(n))
donor_record = [-1] * n # who the donor has donated to
rec_record = [-1] * n # who the recipient has received from
num_donations = [0] * n
while unmatched_donors:
donor = unmatched_donors[0]
donor_preference = donor_pref[donor]
recipient = donor_preference[num_donations[donor]]
num_donations[donor] += 1
rec_preference = recipient_pref[recipient]
prev_donor = rec_record[recipient]
if prev_donor != -1:
if rec_preference.index(prev_donor) > rec_preference.index(donor):
rec_record[recipient] = donor
donor_record[donor] = recipient
unmatched_donors.append(prev_donor)
unmatched_donors.remove(donor)
else:
rec_record[recipient] = donor
donor_record[donor] = recipient
unmatched_donors.remove(donor)
return donor_record
| -1 |
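A pairing returned by a Gale-Shapley style procedure can be checked for stability by brute force; the sketch below assumes stable_matching from the file above is in scope and reuses the docstring's example preferences.

# Sketch only: brute-force stability check, assuming stable_matching is in scope.
donor_pref = [[0, 1, 3, 2], [0, 2, 3, 1], [1, 0, 2, 3], [0, 3, 1, 2]]
recipient_pref = [[3, 1, 2, 0], [3, 1, 0, 2], [0, 3, 1, 2], [1, 0, 3, 2]]
matching = stable_matching(donor_pref, recipient_pref)  # donor -> recipient
recipient_to_donor = {recipient: donor for donor, recipient in enumerate(matching)}
for donor, recipient in enumerate(matching):
    my_rank = donor_pref[donor].index(recipient)
    for preferred in donor_pref[donor][:my_rank]:  # recipients this donor likes more
        partner = recipient_to_donor[preferred]
        # no blocking pair: that recipient must like their partner more than this donor
        assert recipient_pref[preferred].index(partner) < recipient_pref[preferred].index(donor)
print("matching", matching, "is stable")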
TheAlgorithms/Python | 6,258 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler` | ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | ngiachou | "2022-07-20T00:10:06Z" | "2022-09-14T08:40:04Z" | 81e30fd33c91bc37bc3baf54c42d1b192ecf41a6 | 2104fa7aebe8d76b2b2b2c47fe7e2ee615a05df6 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler`. ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | #!/bin/sh
#
# An example hook script to verify what is about to be committed.
# Called by "git merge" with no arguments. The hook should
# exit with non-zero status after issuing an appropriate message to
# stderr if it wants to stop the merge commit.
#
# To enable this hook, rename this file to "pre-merge-commit".
. git-sh-setup
test -x "$GIT_DIR/hooks/pre-commit" &&
exec "$GIT_DIR/hooks/pre-commit"
:
| #!/bin/sh
#
# An example hook script to verify what is about to be committed.
# Called by "git merge" with no arguments. The hook should
# exit with non-zero status after issuing an appropriate message to
# stderr if it wants to stop the merge commit.
#
# To enable this hook, rename this file to "pre-merge-commit".
. git-sh-setup
test -x "$GIT_DIR/hooks/pre-commit" &&
exec "$GIT_DIR/hooks/pre-commit"
:
| -1 |
TheAlgorithms/Python | 6,258 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler` | ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | ngiachou | "2022-07-20T00:10:06Z" | "2022-09-14T08:40:04Z" | 81e30fd33c91bc37bc3baf54c42d1b192ecf41a6 | 2104fa7aebe8d76b2b2b2c47fe7e2ee615a05df6 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler`. ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | """
This is forecasting code, modified here to act as a safety checker for data.
For example: you run an online shop and, for some reason, some data are
missing (the amount of data you received is not what you expected);
then we can use this check.
Notes: 1. Of course a standard statistical method could be used, but in this case
the data is quite irregular and there is only a little of it.
2. You can also adapt this code for forecasting purposes,
e.g. the next 3 months of sales; just adjust it for your own use case.
"""
import numpy as np
import pandas as pd
from sklearn.preprocessing import Normalizer
from sklearn.svm import SVR
from statsmodels.tsa.statespace.sarimax import SARIMAX
def linear_regression_prediction(
train_dt: list, train_usr: list, train_mtch: list, test_dt: list, test_mtch: list
) -> float:
"""
First method: linear regression
input : training data (date, total_user, total_event) in list of float
output : list of total user prediction in float
>>> n = linear_regression_prediction([2,3,4,5], [5,3,4,6], [3,1,2,4], [2,1], [2,2])
>>> abs(n - 5.0) < 1e-6 # Checking precision because of floating point errors
True
"""
x = np.array([[1, item, train_mtch[i]] for i, item in enumerate(train_dt)])
y = np.array(train_usr)
beta = np.dot(np.dot(np.linalg.inv(np.dot(x.transpose(), x)), x.transpose()), y)
return abs(beta[0] + test_dt[0] * beta[1] + test_mtch[0] + beta[2])
def sarimax_predictor(train_user: list, train_match: list, test_match: list) -> float:
"""
Second method: SARIMAX
SARIMAX is a statistical method that uses previous inputs
and learns their pattern to predict future data
input : training data (total_user, with exog data = total_event) in list of float
output : list of total user prediction in float
>>> sarimax_predictor([4,2,6,8], [3,1,2,4], [2])
6.6666671111109626
"""
order = (1, 2, 1)
seasonal_order = (1, 1, 0, 7)
model = SARIMAX(
train_user, exog=train_match, order=order, seasonal_order=seasonal_order
)
model_fit = model.fit(disp=False, maxiter=600, method="nm")
result = model_fit.predict(1, len(test_match), exog=[test_match])
return result[0]
def support_vector_regressor(x_train: list, x_test: list, train_user: list) -> float:
"""
Third method: Support vector regressor
SVR is very similar to SVM (support vector machine):
it uses the same principles as the SVM for classification,
with only a few minor differences; the main one is that
it is better suited to regression problems
input : training data (date, total_user, total_event) in list of float
where x = list of set (date and total event)
output : list of total user prediction in float
>>> support_vector_regressor([[5,2],[1,5],[6,2]], [[3,2]], [2,1,4])
1.634932078116079
"""
regressor = SVR(kernel="rbf", C=1, gamma=0.1, epsilon=0.1)
regressor.fit(x_train, train_user)
y_pred = regressor.predict(x_test)
return y_pred[0]
def interquartile_range_checker(train_user: list) -> float:
"""
Optional method: interquartile range
input : list of total user in float
output : low limit of input in float
this method can be used to check whether some data is an outlier or not
>>> interquartile_range_checker([1,2,3,4,5,6,7,8,9,10])
2.8
"""
train_user.sort()
q1 = np.percentile(train_user, 25)
q3 = np.percentile(train_user, 75)
iqr = q3 - q1
low_lim = q1 - (iqr * 0.1)
return low_lim
def data_safety_checker(list_vote: list, actual_result: float) -> None:
"""
Used to review all the votes (list result prediction)
and compare it to the actual result.
input : list of predictions
output : print whether it's safe or not
>>> data_safety_checker([2,3,4],5.0)
Today's data is not safe.
"""
safe = 0
not_safe = 0
for i in list_vote:
if i > actual_result:
not_safe = not_safe + 1
else:
if abs(abs(i) - abs(actual_result)) <= 0.1:
safe = safe + 1
else:
not_safe = not_safe + 1
print(f"Today's data is {'not ' if safe <= not_safe else ''}safe.")
# data_input_df = pd.read_csv("ex_data.csv", header=None)
data_input = [[18231, 0.0, 1], [22621, 1.0, 2], [15675, 0.0, 3], [23583, 1.0, 4]]
data_input_df = pd.DataFrame(data_input, columns=["total_user", "total_even", "days"])
"""
data column = total user in a day, how much online event held in one day,
what day is that(sunday-saturday)
"""
# start normalization
normalize_df = Normalizer().fit_transform(data_input_df.values)
# split data
total_date = normalize_df[:, 2].tolist()
total_user = normalize_df[:, 0].tolist()
total_match = normalize_df[:, 1].tolist()
# for svr (input variable = total date and total match)
x = normalize_df[:, [1, 2]].tolist()
x_train = x[: len(x) - 1]
x_test = x[len(x) - 1 :]
# for linear reression & sarimax
trn_date = total_date[: len(total_date) - 1]
trn_user = total_user[: len(total_user) - 1]
trn_match = total_match[: len(total_match) - 1]
tst_date = total_date[len(total_date) - 1 :]
tst_user = total_user[len(total_user) - 1 :]
tst_match = total_match[len(total_match) - 1 :]
# voting system with forecasting
res_vote = []
res_vote.append(
linear_regression_prediction(trn_date, trn_user, trn_match, tst_date, tst_match)
)
res_vote.append(sarimax_predictor(trn_user, trn_match, tst_match))
res_vote.append(support_vector_regressor(x_train, x_test, trn_user))
# check the safety of today's data
data_safety_checker(res_vote, tst_user[0])  # pass the single actual value, not the list
| """
This is forecasting code, modified here to act as a safety checker for data.
For example: you run an online shop and, for some reason, some data are
missing (the amount of data you received is not what you expected);
then we can use this check.
Notes: 1. Of course a standard statistical method could be used, but in this case
the data is quite irregular and there is only a little of it.
2. You can also adapt this code for forecasting purposes,
e.g. the next 3 months of sales; just adjust it for your own use case.
"""
import numpy as np
import pandas as pd
from sklearn.preprocessing import Normalizer
from sklearn.svm import SVR
from statsmodels.tsa.statespace.sarimax import SARIMAX
def linear_regression_prediction(
train_dt: list, train_usr: list, train_mtch: list, test_dt: list, test_mtch: list
) -> float:
"""
First method: linear regression
input : training data (date, total_user, total_event) in list of float
output : list of total user prediction in float
>>> n = linear_regression_prediction([2,3,4,5], [5,3,4,6], [3,1,2,4], [2,1], [2,2])
>>> abs(n - 5.0) < 1e-6 # Checking precision because of floating point errors
True
"""
x = np.array([[1, item, train_mtch[i]] for i, item in enumerate(train_dt)])
y = np.array(train_usr)
beta = np.dot(np.dot(np.linalg.inv(np.dot(x.transpose(), x)), x.transpose()), y)
return abs(beta[0] + test_dt[0] * beta[1] + test_mtch[0] + beta[2])
def sarimax_predictor(train_user: list, train_match: list, test_match: list) -> float:
"""
Second method: SARIMAX
SARIMAX is a statistical method that uses previous inputs
and learns their pattern to predict future data
input : training data (total_user, with exog data = total_event) in list of float
output : list of total user prediction in float
>>> sarimax_predictor([4,2,6,8], [3,1,2,4], [2])
6.6666671111109626
"""
order = (1, 2, 1)
seasonal_order = (1, 1, 0, 7)
model = SARIMAX(
train_user, exog=train_match, order=order, seasonal_order=seasonal_order
)
model_fit = model.fit(disp=False, maxiter=600, method="nm")
result = model_fit.predict(1, len(test_match), exog=[test_match])
return result[0]
def support_vector_regressor(x_train: list, x_test: list, train_user: list) -> float:
"""
Third method: Support vector regressor
SVR is very similar to SVM (support vector machine):
it uses the same principles as the SVM for classification,
with only a few minor differences; the main one is that
it is better suited to regression problems
input : training data (date, total_user, total_event) in list of float
where x = list of set (date and total event)
output : list of total user prediction in float
>>> support_vector_regressor([[5,2],[1,5],[6,2]], [[3,2]], [2,1,4])
1.634932078116079
"""
regressor = SVR(kernel="rbf", C=1, gamma=0.1, epsilon=0.1)
regressor.fit(x_train, train_user)
y_pred = regressor.predict(x_test)
return y_pred[0]
def interquartile_range_checker(train_user: list) -> float:
"""
Optional method: interquartile range
input : list of total user in float
output : low limit of input in float
this method can be used to check whether some data is an outlier or not
>>> interquartile_range_checker([1,2,3,4,5,6,7,8,9,10])
2.8
"""
train_user.sort()
q1 = np.percentile(train_user, 25)
q3 = np.percentile(train_user, 75)
iqr = q3 - q1
low_lim = q1 - (iqr * 0.1)
return low_lim
def data_safety_checker(list_vote: list, actual_result: float) -> None:
"""
Used to review all the votes (list result prediction)
and compare it to the actual result.
input : list of predictions
output : print whether it's safe or not
>>> data_safety_checker([2,3,4],5.0)
Today's data is not safe.
"""
safe = 0
not_safe = 0
for i in list_vote:
if i > actual_result:
not_safe = not_safe + 1
else:
if abs(abs(i) - abs(actual_result)) <= 0.1:
safe = safe + 1
else:
not_safe = not_safe + 1
print(f"Today's data is {'not ' if safe <= not_safe else ''}safe.")
# data_input_df = pd.read_csv("ex_data.csv", header=None)
data_input = [[18231, 0.0, 1], [22621, 1.0, 2], [15675, 0.0, 3], [23583, 1.0, 4]]
data_input_df = pd.DataFrame(data_input, columns=["total_user", "total_even", "days"])
"""
data column = total user in a day, how much online event held in one day,
what day is that(sunday-saturday)
"""
# start normalization
normalize_df = Normalizer().fit_transform(data_input_df.values)
# split data
total_date = normalize_df[:, 2].tolist()
total_user = normalize_df[:, 0].tolist()
total_match = normalize_df[:, 1].tolist()
# for svr (input variable = total date and total match)
x = normalize_df[:, [1, 2]].tolist()
x_train = x[: len(x) - 1]
x_test = x[len(x) - 1 :]
# for linear reression & sarimax
trn_date = total_date[: len(total_date) - 1]
trn_user = total_user[: len(total_user) - 1]
trn_match = total_match[: len(total_match) - 1]
tst_date = total_date[len(total_date) - 1 :]
tst_user = total_user[len(total_user) - 1 :]
tst_match = total_match[len(total_match) - 1 :]
# voting system with forecasting
res_vote = []
res_vote.append(
linear_regression_prediction(trn_date, trn_user, trn_match, tst_date, tst_match)
)
res_vote.append(sarimax_predictor(trn_user, trn_match, tst_match))
res_vote.append(support_vector_regressor(x_train, x_test, trn_user))
# check the safety of today's data
data_safety_checker(res_vote, tst_user[0])  # pass the single actual value, not the list
| -1 |
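The interquartile-range check above can be illustrated on its own with a few made-up daily totals; this is only a sketch of the same 0.1 * IQR lower-limit rule, not part of the original script.

# Sketch only: the 0.1 * IQR lower-limit rule on a few made-up daily totals.
import numpy as np

samples = [12.0, 12.5, 11.8, 12.2, 12.4, 3.0]  # 3.0 looks like missing data
q1, q3 = np.percentile(samples, 25), np.percentile(samples, 75)
low_limit = q1 - 0.1 * (q3 - q1)
for value in samples:
    status = "ok" if value >= low_limit else "possible missing data"
    print(value, status)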
TheAlgorithms/Python | 6,258 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler` | ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | ngiachou | "2022-07-20T00:10:06Z" | "2022-09-14T08:40:04Z" | 81e30fd33c91bc37bc3baf54c42d1b192ecf41a6 | 2104fa7aebe8d76b2b2b2c47fe7e2ee615a05df6 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler`. ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | from __future__ import annotations
def double_linear_search(array: list[int], search_item: int) -> int:
"""
Iterate through the array from both sides to find the index of search_item.
:param array: the array to be searched
:param search_item: the item to be searched
:return the index of search_item, if search_item is in array, else -1
Examples:
>>> double_linear_search([1, 5, 5, 10], 1)
0
>>> double_linear_search([1, 5, 5, 10], 5)
1
>>> double_linear_search([1, 5, 5, 10], 100)
-1
>>> double_linear_search([1, 5, 5, 10], 10)
3
"""
# define the start and end index of the given array
start_ind, end_ind = 0, len(array) - 1
while start_ind <= end_ind:
if array[start_ind] == search_item:
return start_ind
elif array[end_ind] == search_item:
return end_ind
else:
start_ind += 1
end_ind -= 1
# returns -1 if search_item is not found in array
return -1
if __name__ == "__main__":
print(double_linear_search(list(range(100)), 40))
| from __future__ import annotations
def double_linear_search(array: list[int], search_item: int) -> int:
"""
Iterate through the array from both sides to find the index of search_item.
:param array: the array to be searched
:param search_item: the item to be searched
:return the index of search_item, if search_item is in array, else -1
Examples:
>>> double_linear_search([1, 5, 5, 10], 1)
0
>>> double_linear_search([1, 5, 5, 10], 5)
1
>>> double_linear_search([1, 5, 5, 10], 100)
-1
>>> double_linear_search([1, 5, 5, 10], 10)
3
"""
# define the start and end index of the given array
start_ind, end_ind = 0, len(array) - 1
while start_ind <= end_ind:
if array[start_ind] == search_item:
return start_ind
elif array[end_ind] == search_item:
return end_ind
else:
start_ind += 1
end_ind -= 1
# returns -1 if search_item is not found in array
return -1
if __name__ == "__main__":
print(double_linear_search(list(range(100)), 40))
| -1 |
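A simple way to gain confidence in double_linear_search is to compare it against plain membership checks on random data; the sketch below assumes the function above is importable or in scope.

# Sketch only: compare double_linear_search against membership checks.
import random

data = [random.randrange(50) for _ in range(200)]
for target in range(60):
    found = double_linear_search(data, target)
    if target in data:
        assert data[found] == target  # any index holding the target is acceptable
    else:
        assert found == -1
print("double_linear_search agrees with membership checks")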
TheAlgorithms/Python | 6,258 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler` | ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | ngiachou | "2022-07-20T00:10:06Z" | "2022-09-14T08:40:04Z" | 81e30fd33c91bc37bc3baf54c42d1b192ecf41a6 | 2104fa7aebe8d76b2b2b2c47fe7e2ee615a05df6 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler`. ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | # Random Forest Classifier Example
from matplotlib import pyplot as plt
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import plot_confusion_matrix
from sklearn.model_selection import train_test_split
def main():
"""
Random Forest Classifier example using sklearn's RandomForestClassifier.
The Iris dataset is used to demonstrate the algorithm.
"""
# Load Iris dataset
iris = load_iris()
# Split dataset into train and test data
X = iris["data"] # features
Y = iris["target"]
x_train, x_test, y_train, y_test = train_test_split(
X, Y, test_size=0.3, random_state=1
)
# Random Forest Classifier
rand_for = RandomForestClassifier(random_state=42, n_estimators=100)
rand_for.fit(x_train, y_train)
# Display Confusion Matrix of Classifier
plot_confusion_matrix(
rand_for,
x_test,
y_test,
display_labels=iris["target_names"],
cmap="Blues",
normalize="true",
)
plt.title("Normalized Confusion Matrix - IRIS Dataset")
plt.show()
if __name__ == "__main__":
main()
| # Random Forest Classifier Example
from matplotlib import pyplot as plt
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import plot_confusion_matrix
from sklearn.model_selection import train_test_split
def main():
"""
Random Forest Classifier example using sklearn's RandomForestClassifier.
The Iris dataset is used to demonstrate the algorithm.
"""
# Load Iris dataset
iris = load_iris()
# Split dataset into train and test data
X = iris["data"] # features
Y = iris["target"]
x_train, x_test, y_train, y_test = train_test_split(
X, Y, test_size=0.3, random_state=1
)
# Random Forest Classifier
rand_for = RandomForestClassifier(random_state=42, n_estimators=100)
rand_for.fit(x_train, y_train)
# Display Confusion Matrix of Classifier
plot_confusion_matrix(
rand_for,
x_test,
y_test,
display_labels=iris["target_names"],
cmap="Blues",
normalize="true",
)
plt.title("Normalized Confusion Matrix - IRIS Dataset")
plt.show()
if __name__ == "__main__":
main()
| -1 |
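The PR description repeated across these rows replaces sieve-based checks with a single trial-division test that stops at sqrt(n); the unified function itself lives in the project_euler sources, so the following is only a minimal sketch of that style of primality test, not code copied from the diff:
import math
def is_prime(number: int) -> bool:
    """Illustrative O(sqrt(n)) trial-division primality test."""
    if number < 2:
        return False
    if number < 4:
        return True  # 2 and 3 are prime
    if number % 2 == 0:
        return False
    # only odd divisors up to sqrt(number) need to be checked
    for divisor in range(3, int(math.sqrt(number)) + 1, 2):
        if number % divisor == 0:
            return False
    return True
assert [n for n in range(20) if is_prime(n)] == [2, 3, 5, 7, 11, 13, 17, 19]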
TheAlgorithms/Python | 6,258 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler` | ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | ngiachou | "2022-07-20T00:10:06Z" | "2022-09-14T08:40:04Z" | 81e30fd33c91bc37bc3baf54c42d1b192ecf41a6 | 2104fa7aebe8d76b2b2b2c47fe7e2ee615a05df6 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler`. ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | """
The Reverse Polish Notation, also known as Polish postfix notation
or simply postfix notation.
https://en.wikipedia.org/wiki/Reverse_Polish_notation
Classic examples of simple stack implementations
Valid operators are +, -, *, /.
Each operand may be an integer or another expression.
"""
from __future__ import annotations
from typing import Any
def evaluate_postfix(postfix_notation: list) -> int:
"""
>>> evaluate_postfix(["2", "1", "+", "3", "*"])
9
>>> evaluate_postfix(["4", "13", "5", "/", "+"])
6
>>> evaluate_postfix([])
0
"""
if not postfix_notation:
return 0
operations = {"+", "-", "*", "/"}
stack: list[Any] = []
for token in postfix_notation:
if token in operations:
b, a = stack.pop(), stack.pop()
if token == "+":
stack.append(a + b)
elif token == "-":
stack.append(a - b)
elif token == "*":
stack.append(a * b)
else:
if a * b < 0 and a % b != 0:
stack.append(a // b + 1)
else:
stack.append(a // b)
else:
stack.append(int(token))
return stack.pop()
if __name__ == "__main__":
import doctest
doctest.testmod()
| """
The Reverse Polish Notation, also known as Polish postfix notation
or simply postfix notation.
https://en.wikipedia.org/wiki/Reverse_Polish_notation
Classic examples of simple stack implementations
Valid operators are +, -, *, /.
Each operand may be an integer or another expression.
"""
from __future__ import annotations
from typing import Any
def evaluate_postfix(postfix_notation: list) -> int:
"""
>>> evaluate_postfix(["2", "1", "+", "3", "*"])
9
>>> evaluate_postfix(["4", "13", "5", "/", "+"])
6
>>> evaluate_postfix([])
0
"""
if not postfix_notation:
return 0
operations = {"+", "-", "*", "/"}
stack: list[Any] = []
for token in postfix_notation:
if token in operations:
b, a = stack.pop(), stack.pop()
if token == "+":
stack.append(a + b)
elif token == "-":
stack.append(a - b)
elif token == "*":
stack.append(a * b)
else:
if a * b < 0 and a % b != 0:
stack.append(a // b + 1)
else:
stack.append(a // b)
else:
stack.append(int(token))
return stack.pop()
if __name__ == "__main__":
import doctest
doctest.testmod()
| -1 |
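The division branch in evaluate_postfix above (the `a * b < 0 and a % b != 0` case) compensates for Python's floor division, which rounds toward negative infinity, whereas most postfix evaluators truncate toward zero; a small standalone check of that behaviour, illustrative only, is:
# Floor division vs. truncation toward zero, mirroring the branch in evaluate_postfix.
a, b = 13, -5
print(a // b)       # -3  (floor division rounds toward -infinity)
print(int(a / b))   # -2  (truncation toward zero)
print(a // b + 1 if a * b < 0 and a % b != 0 else a // b)  # -2, matching truncation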
TheAlgorithms/Python | 6,258 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler` | ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | ngiachou | "2022-07-20T00:10:06Z" | "2022-09-14T08:40:04Z" | 81e30fd33c91bc37bc3baf54c42d1b192ecf41a6 | 2104fa7aebe8d76b2b2b2c47fe7e2ee615a05df6 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler`. ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | from __future__ import annotations
def median(nums: list) -> int | float:
"""
Find median of a list of numbers.
Wiki: https://en.wikipedia.org/wiki/Median
>>> median([0])
0
>>> median([4, 1, 3, 2])
2.5
>>> median([2, 70, 6, 50, 20, 8, 4])
8
Args:
nums: List of nums
Returns:
Median.
"""
sorted_list = sorted(nums)
length = len(sorted_list)
mid_index = length >> 1
return (
(sorted_list[mid_index] + sorted_list[mid_index - 1]) / 2
if length % 2 == 0
else sorted_list[mid_index]
)
def main():
import doctest
doctest.testmod()
if __name__ == "__main__":
main()
| from __future__ import annotations
def median(nums: list) -> int | float:
"""
Find median of a list of numbers.
Wiki: https://en.wikipedia.org/wiki/Median
>>> median([0])
0
>>> median([4, 1, 3, 2])
2.5
>>> median([2, 70, 6, 50, 20, 8, 4])
8
Args:
nums: List of nums
Returns:
Median.
"""
sorted_list = sorted(nums)
length = len(sorted_list)
mid_index = length >> 1
return (
(sorted_list[mid_index] + sorted_list[mid_index - 1]) / 2
if length % 2 == 0
else sorted_list[mid_index]
)
def main():
import doctest
doctest.testmod()
if __name__ == "__main__":
main()
| -1 |
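As a sanity check on the index arithmetic above, the hand-rolled median can be compared with the standard library, which applies the same even/odd rule; this comparison is illustrative and not part of the PR:
import statistics
# statistics.median also averages the two middle values for an even-length input,
# so it agrees with the doctests of median() shown above: 0, 2.5 and 8.
for sample in ([0], [4, 1, 3, 2], [2, 70, 6, 50, 20, 8, 4]):
    print(statistics.median(sample))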
TheAlgorithms/Python | 6,258 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler` | ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | ngiachou | "2022-07-20T00:10:06Z" | "2022-09-14T08:40:04Z" | 81e30fd33c91bc37bc3baf54c42d1b192ecf41a6 | 2104fa7aebe8d76b2b2b2c47fe7e2ee615a05df6 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler`. ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | """
Implementation of Bilateral filter
Inputs:
img: A 2d image with values in between 0 and 1
varS: variance in the space dimension (spatial_variance).
varI: variance in intensity (intensity_variance).
N: kernel size, must be an odd number (kernel_size).
Output:
img: A 2d zero-padded image with values in between 0 and 1
"""
import math
import sys
import cv2
import numpy as np
def vec_gaussian(img: np.ndarray, variance: float) -> np.ndarray:
# Apply the Gaussian function to each element of the matrix.
sigma = math.sqrt(variance)
cons = 1 / (sigma * math.sqrt(2 * math.pi))
return cons * np.exp(-((img / sigma) ** 2) * 0.5)
def get_slice(img: np.ndarray, x: int, y: int, kernel_size: int) -> np.ndarray:
half = kernel_size // 2
return img[x - half : x + half + 1, y - half : y + half + 1]
def get_gauss_kernel(kernel_size: int, spatial_variance: float) -> np.ndarray:
# Creates a gaussian kernel of given dimension.
arr = np.zeros((kernel_size, kernel_size))
for i in range(0, kernel_size):
for j in range(0, kernel_size):
arr[i, j] = math.sqrt(
abs(i - kernel_size // 2) ** 2 + abs(j - kernel_size // 2) ** 2
)
return vec_gaussian(arr, spatial_variance)
def bilateral_filter(
img: np.ndarray,
spatial_variance: float,
intensity_variance: float,
kernel_size: int,
) -> np.ndarray:
img2 = np.zeros(img.shape)
gaussKer = get_gauss_kernel(kernel_size, spatial_variance)
sizeX, sizeY = img.shape
for i in range(kernel_size // 2, sizeX - kernel_size // 2):
for j in range(kernel_size // 2, sizeY - kernel_size // 2):
imgS = get_slice(img, i, j, kernel_size)
imgI = imgS - imgS[kernel_size // 2, kernel_size // 2]
imgIG = vec_gaussian(imgI, intensity_variance)
weights = np.multiply(gaussKer, imgIG)
vals = np.multiply(imgS, weights)
val = np.sum(vals) / np.sum(weights)
img2[i, j] = val
return img2
def parse_args(args: list) -> tuple:
filename = args[1] if args[1:] else "../image_data/lena.jpg"
spatial_variance = float(args[2]) if args[2:] else 1.0
intensity_variance = float(args[3]) if args[3:] else 1.0
if args[4:]:
kernel_size = int(args[4])
kernel_size = kernel_size + abs(kernel_size % 2 - 1)
else:
kernel_size = 5
return filename, spatial_variance, intensity_variance, kernel_size
if __name__ == "__main__":
filename, spatial_variance, intensity_variance, kernel_size = parse_args(sys.argv)
img = cv2.imread(filename, 0)
cv2.imshow("input image", img)
out = img / 255
out = out.astype("float32")
out = bilateral_filter(out, spatial_variance, intensity_variance, kernel_size)
out = out * 255
out = np.uint8(out)
cv2.imshow("output image", out)
cv2.waitKey(0)
cv2.destroyAllWindows()
| """
Implementation of Bilateral filter
Inputs:
img: A 2d image with values in between 0 and 1
varS: variance in the space dimension (spatial_variance).
varI: variance in intensity (intensity_variance).
N: kernel size, must be an odd number (kernel_size).
Output:
img: A 2d zero-padded image with values in between 0 and 1
"""
import math
import sys
import cv2
import numpy as np
def vec_gaussian(img: np.ndarray, variance: float) -> np.ndarray:
# Apply the Gaussian function to each element of the matrix.
sigma = math.sqrt(variance)
cons = 1 / (sigma * math.sqrt(2 * math.pi))
return cons * np.exp(-((img / sigma) ** 2) * 0.5)
def get_slice(img: np.ndarray, x: int, y: int, kernel_size: int) -> np.ndarray:
half = kernel_size // 2
return img[x - half : x + half + 1, y - half : y + half + 1]
def get_gauss_kernel(kernel_size: int, spatial_variance: float) -> np.ndarray:
# Creates a gaussian kernel of given dimension.
arr = np.zeros((kernel_size, kernel_size))
for i in range(0, kernel_size):
for j in range(0, kernel_size):
arr[i, j] = math.sqrt(
abs(i - kernel_size // 2) ** 2 + abs(j - kernel_size // 2) ** 2
)
return vec_gaussian(arr, spatial_variance)
def bilateral_filter(
img: np.ndarray,
spatial_variance: float,
intensity_variance: float,
kernel_size: int,
) -> np.ndarray:
img2 = np.zeros(img.shape)
gaussKer = get_gauss_kernel(kernel_size, spatial_variance)
sizeX, sizeY = img.shape
for i in range(kernel_size // 2, sizeX - kernel_size // 2):
for j in range(kernel_size // 2, sizeY - kernel_size // 2):
imgS = get_slice(img, i, j, kernel_size)
imgI = imgS - imgS[kernel_size // 2, kernel_size // 2]
imgIG = vec_gaussian(imgI, intensity_variance)
weights = np.multiply(gaussKer, imgIG)
vals = np.multiply(imgS, weights)
val = np.sum(vals) / np.sum(weights)
img2[i, j] = val
return img2
def parse_args(args: list) -> tuple:
filename = args[1] if args[1:] else "../image_data/lena.jpg"
spatial_variance = float(args[2]) if args[2:] else 1.0
intensity_variance = float(args[3]) if args[3:] else 1.0
if args[4:]:
kernel_size = int(args[4])
kernel_size = kernel_size + abs(kernel_size % 2 - 1)
else:
kernel_size = 5
return filename, spatial_variance, intensity_variance, kernel_size
if __name__ == "__main__":
filename, spatial_variance, intensity_variance, kernel_size = parse_args(sys.argv)
img = cv2.imread(filename, 0)
cv2.imshow("input image", img)
out = img / 255
out = out.astype("float32")
out = bilateral_filter(out, spatial_variance, intensity_variance, kernel_size)
out = out * 255
out = np.uint8(out)
cv2.imshow("output image", out)
cv2.waitKey(0)
cv2.destroyAllWindows()
| -1 |
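The double loop in get_gauss_kernel above builds a distance-from-centre matrix and then applies the Gaussian; the same kernel can be produced with vectorised numpy, shown here as an assumed equivalent rather than a drop-in replacement from the repository:
import numpy as np
def gauss_kernel_vectorized(kernel_size: int, spatial_variance: float) -> np.ndarray:
    # distance of every cell from the kernel centre, then the same Gaussian as vec_gaussian
    sigma = np.sqrt(spatial_variance)
    ax = np.arange(kernel_size) - kernel_size // 2
    xx, yy = np.meshgrid(ax, ax)
    dist = np.sqrt(xx**2 + yy**2)
    return np.exp(-0.5 * (dist / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
print(gauss_kernel_vectorized(5, 1.0).round(3))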
TheAlgorithms/Python | 6,258 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler` | ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | ngiachou | "2022-07-20T00:10:06Z" | "2022-09-14T08:40:04Z" | 81e30fd33c91bc37bc3baf54c42d1b192ecf41a6 | 2104fa7aebe8d76b2b2b2c47fe7e2ee615a05df6 | Unify `O(sqrt(N))` `is_prime` functions under `project_euler`. ### Describe your change:
I changed the implementation of is_prime functions inside project_euler in order to have a unified implementation. There are some cases where the solution uses the Eratosthenes' sieve method. In some cases there is no specific gain from using that method so I changed it to the O(sqrt(n)) algorithm, but in other cases the sieve method is used in the core structure of the solution, hence I did not touch those.
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [x] All new Python files are placed inside an existing directory.
* [x] All filenames are in all lowercase characters with no spaces or dashes.
* [x] All functions and variable names follow Python naming conventions.
* [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
fixes #5434 | """
Implementation of double ended queue.
"""
from __future__ import annotations
from collections.abc import Iterable
from dataclasses import dataclass
from typing import Any
class Deque:
"""
Deque data structure.
Operations
----------
append(val: Any) -> None
appendleft(val: Any) -> None
extend(iter: Iterable) -> None
extendleft(iter: Iterable) -> None
pop() -> Any
popleft() -> Any
Observers
---------
is_empty() -> bool
Attributes
----------
_front: _Node
front of the deque a.k.a. the first element
_back: _Node
back of the deque a.k.a. the last element
_len: int
the number of nodes
"""
__slots__ = ["_front", "_back", "_len"]
@dataclass
class _Node:
"""
Representation of a node.
Contains a value and a pointer to the next node as well as to the previous one.
"""
val: Any = None
next: Deque._Node | None = None
prev: Deque._Node | None = None
class _Iterator:
"""
Helper class used to implement iteration over the deque.
Attributes
----------
_cur: _Node
the current node of the iteration.
"""
__slots__ = ["_cur"]
def __init__(self, cur: Deque._Node | None) -> None:
self._cur = cur
def __iter__(self) -> Deque._Iterator:
"""
>>> our_deque = Deque([1, 2, 3])
>>> iterator = iter(our_deque)
"""
return self
def __next__(self) -> Any:
"""
>>> our_deque = Deque([1, 2, 3])
>>> iterator = iter(our_deque)
>>> next(iterator)
1
>>> next(iterator)
2
>>> next(iterator)
3
"""
if self._cur is None:
# finished iterating
raise StopIteration
val = self._cur.val
self._cur = self._cur.next
return val
def __init__(self, iterable: Iterable[Any] | None = None) -> None:
self._front: Any = None
self._back: Any = None
self._len: int = 0
if iterable is not None:
# append every value to the deque
for val in iterable:
self.append(val)
def append(self, val: Any) -> None:
"""
Adds val to the end of the deque.
Time complexity: O(1)
>>> our_deque_1 = Deque([1, 2, 3])
>>> our_deque_1.append(4)
>>> our_deque_1
[1, 2, 3, 4]
>>> our_deque_2 = Deque('ab')
>>> our_deque_2.append('c')
>>> our_deque_2
['a', 'b', 'c']
>>> from collections import deque
>>> deque_collections_1 = deque([1, 2, 3])
>>> deque_collections_1.append(4)
>>> deque_collections_1
deque([1, 2, 3, 4])
>>> deque_collections_2 = deque('ab')
>>> deque_collections_2.append('c')
>>> deque_collections_2
deque(['a', 'b', 'c'])
>>> list(our_deque_1) == list(deque_collections_1)
True
>>> list(our_deque_2) == list(deque_collections_2)
True
"""
node = self._Node(val, None, None)
if self.is_empty():
# front = back
self._front = self._back = node
self._len = 1
else:
# connect nodes
self._back.next = node
node.prev = self._back
self._back = node # assign new back to the new node
self._len += 1
# make sure there were no errors
assert not self.is_empty(), "Error on appending value."
def appendleft(self, val: Any) -> None:
"""
Adds val to the beginning of the deque.
Time complexity: O(1)
>>> our_deque_1 = Deque([2, 3])
>>> our_deque_1.appendleft(1)
>>> our_deque_1
[1, 2, 3]
>>> our_deque_2 = Deque('bc')
>>> our_deque_2.appendleft('a')
>>> our_deque_2
['a', 'b', 'c']
>>> from collections import deque
>>> deque_collections_1 = deque([2, 3])
>>> deque_collections_1.appendleft(1)
>>> deque_collections_1
deque([1, 2, 3])
>>> deque_collections_2 = deque('bc')
>>> deque_collections_2.appendleft('a')
>>> deque_collections_2
deque(['a', 'b', 'c'])
>>> list(our_deque_1) == list(deque_collections_1)
True
>>> list(our_deque_2) == list(deque_collections_2)
True
"""
node = self._Node(val, None, None)
if self.is_empty():
# front = back
self._front = self._back = node
self._len = 1
else:
# connect nodes
node.next = self._front
self._front.prev = node
self._front = node # assign new front to the new node
self._len += 1
# make sure there were no errors
assert not self.is_empty(), "Error on appending value."
def extend(self, iter: Iterable[Any]) -> None:
"""
Appends every value of iter to the end of the deque.
Time complexity: O(n)
>>> our_deque_1 = Deque([1, 2, 3])
>>> our_deque_1.extend([4, 5])
>>> our_deque_1
[1, 2, 3, 4, 5]
>>> our_deque_2 = Deque('ab')
>>> our_deque_2.extend('cd')
>>> our_deque_2
['a', 'b', 'c', 'd']
>>> from collections import deque
>>> deque_collections_1 = deque([1, 2, 3])
>>> deque_collections_1.extend([4, 5])
>>> deque_collections_1
deque([1, 2, 3, 4, 5])
>>> deque_collections_2 = deque('ab')
>>> deque_collections_2.extend('cd')
>>> deque_collections_2
deque(['a', 'b', 'c', 'd'])
>>> list(our_deque_1) == list(deque_collections_1)
True
>>> list(our_deque_2) == list(deque_collections_2)
True
"""
for val in iter:
self.append(val)
def extendleft(self, iter: Iterable[Any]) -> None:
"""
Appends every value of iter to the beginning of the deque.
Time complexity: O(n)
>>> our_deque_1 = Deque([1, 2, 3])
>>> our_deque_1.extendleft([0, -1])
>>> our_deque_1
[-1, 0, 1, 2, 3]
>>> our_deque_2 = Deque('cd')
>>> our_deque_2.extendleft('ba')
>>> our_deque_2
['a', 'b', 'c', 'd']
>>> from collections import deque
>>> deque_collections_1 = deque([1, 2, 3])
>>> deque_collections_1.extendleft([0, -1])
>>> deque_collections_1
deque([-1, 0, 1, 2, 3])
>>> deque_collections_2 = deque('cd')
>>> deque_collections_2.extendleft('ba')
>>> deque_collections_2
deque(['a', 'b', 'c', 'd'])
>>> list(our_deque_1) == list(deque_collections_1)
True
>>> list(our_deque_2) == list(deque_collections_2)
True
"""
for val in iter:
self.appendleft(val)
def pop(self) -> Any:
"""
Removes the last element of the deque and returns it.
Time complexity: O(1)
@returns topop.val: the value of the node to pop.
>>> our_deque = Deque([1, 2, 3, 15182])
>>> our_popped = our_deque.pop()
>>> our_popped
15182
>>> our_deque
[1, 2, 3]
>>> from collections import deque
>>> deque_collections = deque([1, 2, 3, 15182])
>>> collections_popped = deque_collections.pop()
>>> collections_popped
15182
>>> deque_collections
deque([1, 2, 3])
>>> list(our_deque) == list(deque_collections)
True
>>> our_popped == collections_popped
True
"""
# make sure the deque has elements to pop
assert not self.is_empty(), "Deque is empty."
topop = self._back
self._back = self._back.prev # set new back
self._back.next = (
None # drop the last node - python will deallocate memory automatically
)
self._len -= 1
return topop.val
def popleft(self) -> Any:
"""
Removes the first element of the deque and returns it.
Time complexity: O(1)
@returns topop.val: the value of the node to pop.
>>> our_deque = Deque([15182, 1, 2, 3])
>>> our_popped = our_deque.popleft()
>>> our_popped
15182
>>> our_deque
[1, 2, 3]
>>> from collections import deque
>>> deque_collections = deque([15182, 1, 2, 3])
>>> collections_popped = deque_collections.popleft()
>>> collections_popped
15182
>>> deque_collections
deque([1, 2, 3])
>>> list(our_deque) == list(deque_collections)
True
>>> our_popped == collections_popped
True
"""
# make sure the deque has elements to pop
assert not self.is_empty(), "Deque is empty."
topop = self._front
self._front = self._front.next # set new front and drop the first node
self._front.prev = None
self._len -= 1
return topop.val
def is_empty(self) -> bool:
"""
Checks if the deque is empty.
Time complexity: O(1)
>>> our_deque = Deque([1, 2, 3])
>>> our_deque.is_empty()
False
>>> our_empty_deque = Deque()
>>> our_empty_deque.is_empty()
True
>>> from collections import deque
>>> empty_deque_collections = deque()
>>> list(our_empty_deque) == list(empty_deque_collections)
True
"""
return self._front is None
def __len__(self) -> int:
"""
Implements len() function. Returns the length of the deque.
Time complexity: O(1)
>>> our_deque = Deque([1, 2, 3])
>>> len(our_deque)
3
>>> our_empty_deque = Deque()
>>> len(our_empty_deque)
0
>>> from collections import deque
>>> deque_collections = deque([1, 2, 3])
>>> len(deque_collections)
3
>>> empty_deque_collections = deque()
>>> len(empty_deque_collections)
0
>>> len(our_empty_deque) == len(empty_deque_collections)
True
"""
return self._len
def __eq__(self, other: object) -> bool:
"""
Implements "==" operator. Returns if *self* is equal to *other*.
Time complexity: O(n)
>>> our_deque_1 = Deque([1, 2, 3])
>>> our_deque_2 = Deque([1, 2, 3])
>>> our_deque_1 == our_deque_2
True
>>> our_deque_3 = Deque([1, 2])
>>> our_deque_1 == our_deque_3
False
>>> from collections import deque
>>> deque_collections_1 = deque([1, 2, 3])
>>> deque_collections_2 = deque([1, 2, 3])
>>> deque_collections_1 == deque_collections_2
True
>>> deque_collections_3 = deque([1, 2])
>>> deque_collections_1 == deque_collections_3
False
>>> (our_deque_1 == our_deque_2) == (deque_collections_1 == deque_collections_2)
True
>>> (our_deque_1 == our_deque_3) == (deque_collections_1 == deque_collections_3)
True
"""
if not isinstance(other, Deque):
return NotImplemented
me = self._front
oth = other._front
# if the length of the deques are not the same, they are not equal
if len(self) != len(other):
return False
while me is not None and oth is not None:
# compare every value
if me.val != oth.val:
return False
me = me.next
oth = oth.next
return True
def __iter__(self) -> Deque._Iterator:
"""
Implements iteration.
Time complexity: O(1)
>>> our_deque = Deque([1, 2, 3])
>>> for v in our_deque:
... print(v)
1
2
3
>>> from collections import deque
>>> deque_collections = deque([1, 2, 3])
>>> for v in deque_collections:
... print(v)
1
2
3
"""
return Deque._Iterator(self._front)
def __repr__(self) -> str:
"""
Implements representation of the deque.
Represents it as a list, with its values between '[' and ']'.
Time complexity: O(n)
>>> our_deque = Deque([1, 2, 3])
>>> our_deque
[1, 2, 3]
"""
values_list = []
aux = self._front
while aux is not None:
# append the values in a list to display
values_list.append(aux.val)
aux = aux.next
return "[" + ", ".join(repr(val) for val in values_list) + "]"
if __name__ == "__main__":
import doctest
doctest.testmod()
| """
Implementation of double ended queue.
"""
from __future__ import annotations
from collections.abc import Iterable
from dataclasses import dataclass
from typing import Any
class Deque:
"""
Deque data structure.
Operations
----------
append(val: Any) -> None
appendleft(val: Any) -> None
extend(iter: Iterable) -> None
extendleft(iter: Iterable) -> None
pop() -> Any
popleft() -> Any
Observers
---------
is_empty() -> bool
Attributes
----------
_front: _Node
front of the deque a.k.a. the first element
_back: _Node
back of the deque a.k.a. the last element
_len: int
the number of nodes
"""
__slots__ = ["_front", "_back", "_len"]
@dataclass
class _Node:
"""
Representation of a node.
Contains a value and a pointer to the next node as well as to the previous one.
"""
val: Any = None
next: Deque._Node | None = None
prev: Deque._Node | None = None
class _Iterator:
"""
Helper class used to implement iteration over the deque.
Attributes
----------
_cur: _Node
the current node of the iteration.
"""
__slots__ = ["_cur"]
def __init__(self, cur: Deque._Node | None) -> None:
self._cur = cur
def __iter__(self) -> Deque._Iterator:
"""
>>> our_deque = Deque([1, 2, 3])
>>> iterator = iter(our_deque)
"""
return self
def __next__(self) -> Any:
"""
>>> our_deque = Deque([1, 2, 3])
>>> iterator = iter(our_deque)
>>> next(iterator)
1
>>> next(iterator)
2
>>> next(iterator)
3
"""
if self._cur is None:
# finished iterating
raise StopIteration
val = self._cur.val
self._cur = self._cur.next
return val
def __init__(self, iterable: Iterable[Any] | None = None) -> None:
self._front: Any = None
self._back: Any = None
self._len: int = 0
if iterable is not None:
# append every value to the deque
for val in iterable:
self.append(val)
def append(self, val: Any) -> None:
"""
Adds val to the end of the deque.
Time complexity: O(1)
>>> our_deque_1 = Deque([1, 2, 3])
>>> our_deque_1.append(4)
>>> our_deque_1
[1, 2, 3, 4]
>>> our_deque_2 = Deque('ab')
>>> our_deque_2.append('c')
>>> our_deque_2
['a', 'b', 'c']
>>> from collections import deque
>>> deque_collections_1 = deque([1, 2, 3])
>>> deque_collections_1.append(4)
>>> deque_collections_1
deque([1, 2, 3, 4])
>>> deque_collections_2 = deque('ab')
>>> deque_collections_2.append('c')
>>> deque_collections_2
deque(['a', 'b', 'c'])
>>> list(our_deque_1) == list(deque_collections_1)
True
>>> list(our_deque_2) == list(deque_collections_2)
True
"""
node = self._Node(val, None, None)
if self.is_empty():
# front = back
self._front = self._back = node
self._len = 1
else:
# connect nodes
self._back.next = node
node.prev = self._back
self._back = node # assign new back to the new node
self._len += 1
# make sure there were no errors
assert not self.is_empty(), "Error on appending value."
def appendleft(self, val: Any) -> None:
"""
Adds val to the beginning of the deque.
Time complexity: O(1)
>>> our_deque_1 = Deque([2, 3])
>>> our_deque_1.appendleft(1)
>>> our_deque_1
[1, 2, 3]
>>> our_deque_2 = Deque('bc')
>>> our_deque_2.appendleft('a')
>>> our_deque_2
['a', 'b', 'c']
>>> from collections import deque
>>> deque_collections_1 = deque([2, 3])
>>> deque_collections_1.appendleft(1)
>>> deque_collections_1
deque([1, 2, 3])
>>> deque_collections_2 = deque('bc')
>>> deque_collections_2.appendleft('a')
>>> deque_collections_2
deque(['a', 'b', 'c'])
>>> list(our_deque_1) == list(deque_collections_1)
True
>>> list(our_deque_2) == list(deque_collections_2)
True
"""
node = self._Node(val, None, None)
if self.is_empty():
# front = back
self._front = self._back = node
self._len = 1
else:
# connect nodes
node.next = self._front
self._front.prev = node
self._front = node # assign new front to the new node
self._len += 1
# make sure there were no errors
assert not self.is_empty(), "Error on appending value."
def extend(self, iter: Iterable[Any]) -> None:
"""
Appends every value of iter to the end of the deque.
Time complexity: O(n)
>>> our_deque_1 = Deque([1, 2, 3])
>>> our_deque_1.extend([4, 5])
>>> our_deque_1
[1, 2, 3, 4, 5]
>>> our_deque_2 = Deque('ab')
>>> our_deque_2.extend('cd')
>>> our_deque_2
['a', 'b', 'c', 'd']
>>> from collections import deque
>>> deque_collections_1 = deque([1, 2, 3])
>>> deque_collections_1.extend([4, 5])
>>> deque_collections_1
deque([1, 2, 3, 4, 5])
>>> deque_collections_2 = deque('ab')
>>> deque_collections_2.extend('cd')
>>> deque_collections_2
deque(['a', 'b', 'c', 'd'])
>>> list(our_deque_1) == list(deque_collections_1)
True
>>> list(our_deque_2) == list(deque_collections_2)
True
"""
for val in iter:
self.append(val)
def extendleft(self, iter: Iterable[Any]) -> None:
"""
Appends every value of iter to the beginning of the deque.
Time complexity: O(n)
>>> our_deque_1 = Deque([1, 2, 3])
>>> our_deque_1.extendleft([0, -1])
>>> our_deque_1
[-1, 0, 1, 2, 3]
>>> our_deque_2 = Deque('cd')
>>> our_deque_2.extendleft('ba')
>>> our_deque_2
['a', 'b', 'c', 'd']
>>> from collections import deque
>>> deque_collections_1 = deque([1, 2, 3])
>>> deque_collections_1.extendleft([0, -1])
>>> deque_collections_1
deque([-1, 0, 1, 2, 3])
>>> deque_collections_2 = deque('cd')
>>> deque_collections_2.extendleft('ba')
>>> deque_collections_2
deque(['a', 'b', 'c', 'd'])
>>> list(our_deque_1) == list(deque_collections_1)
True
>>> list(our_deque_2) == list(deque_collections_2)
True
"""
for val in iter:
self.appendleft(val)
def pop(self) -> Any:
"""
Removes the last element of the deque and returns it.
Time complexity: O(1)
@returns topop.val: the value of the node to pop.
>>> our_deque = Deque([1, 2, 3, 15182])
>>> our_popped = our_deque.pop()
>>> our_popped
15182
>>> our_deque
[1, 2, 3]
>>> from collections import deque
>>> deque_collections = deque([1, 2, 3, 15182])
>>> collections_popped = deque_collections.pop()
>>> collections_popped
15182
>>> deque_collections
deque([1, 2, 3])
>>> list(our_deque) == list(deque_collections)
True
>>> our_popped == collections_popped
True
"""
# make sure the deque has elements to pop
assert not self.is_empty(), "Deque is empty."
topop = self._back
self._back = self._back.prev # set new back
self._back.next = (
None # drop the last node - python will deallocate memory automatically
)
self._len -= 1
return topop.val
def popleft(self) -> Any:
"""
Removes the first element of the deque and returns it.
Time complexity: O(1)
@returns topop.val: the value of the node to pop.
>>> our_deque = Deque([15182, 1, 2, 3])
>>> our_popped = our_deque.popleft()
>>> our_popped
15182
>>> our_deque
[1, 2, 3]
>>> from collections import deque
>>> deque_collections = deque([15182, 1, 2, 3])
>>> collections_popped = deque_collections.popleft()
>>> collections_popped
15182
>>> deque_collections
deque([1, 2, 3])
>>> list(our_deque) == list(deque_collections)
True
>>> our_popped == collections_popped
True
"""
# make sure the deque has elements to pop
assert not self.is_empty(), "Deque is empty."
topop = self._front
self._front = self._front.next # set new front and drop the first node
self._front.prev = None
self._len -= 1
return topop.val
def is_empty(self) -> bool:
"""
Checks if the deque is empty.
Time complexity: O(1)
>>> our_deque = Deque([1, 2, 3])
>>> our_deque.is_empty()
False
>>> our_empty_deque = Deque()
>>> our_empty_deque.is_empty()
True
>>> from collections import deque
>>> empty_deque_collections = deque()
>>> list(our_empty_deque) == list(empty_deque_collections)
True
"""
return self._front is None
def __len__(self) -> int:
"""
Implements len() function. Returns the length of the deque.
Time complexity: O(1)
>>> our_deque = Deque([1, 2, 3])
>>> len(our_deque)
3
>>> our_empty_deque = Deque()
>>> len(our_empty_deque)
0
>>> from collections import deque
>>> deque_collections = deque([1, 2, 3])
>>> len(deque_collections)
3
>>> empty_deque_collections = deque()
>>> len(empty_deque_collections)
0
>>> len(our_empty_deque) == len(empty_deque_collections)
True
"""
return self._len
def __eq__(self, other: object) -> bool:
"""
Implements "==" operator. Returns if *self* is equal to *other*.
Time complexity: O(n)
>>> our_deque_1 = Deque([1, 2, 3])
>>> our_deque_2 = Deque([1, 2, 3])
>>> our_deque_1 == our_deque_2
True
>>> our_deque_3 = Deque([1, 2])
>>> our_deque_1 == our_deque_3
False
>>> from collections import deque
>>> deque_collections_1 = deque([1, 2, 3])
>>> deque_collections_2 = deque([1, 2, 3])
>>> deque_collections_1 == deque_collections_2
True
>>> deque_collections_3 = deque([1, 2])
>>> deque_collections_1 == deque_collections_3
False
>>> (our_deque_1 == our_deque_2) == (deque_collections_1 == deque_collections_2)
True
>>> (our_deque_1 == our_deque_3) == (deque_collections_1 == deque_collections_3)
True
"""
if not isinstance(other, Deque):
return NotImplemented
me = self._front
oth = other._front
# if the length of the deques are not the same, they are not equal
if len(self) != len(other):
return False
while me is not None and oth is not None:
# compare every value
if me.val != oth.val:
return False
me = me.next
oth = oth.next
return True
def __iter__(self) -> Deque._Iterator:
"""
Implements iteration.
Time complexity: O(1)
>>> our_deque = Deque([1, 2, 3])
>>> for v in our_deque:
... print(v)
1
2
3
>>> from collections import deque
>>> deque_collections = deque([1, 2, 3])
>>> for v in deque_collections:
... print(v)
1
2
3
"""
return Deque._Iterator(self._front)
def __repr__(self) -> str:
"""
Implements representation of the deque.
Represents it as a list, with its values between '[' and ']'.
Time complexity: O(n)
>>> our_deque = Deque([1, 2, 3])
>>> our_deque
[1, 2, 3]
"""
values_list = []
aux = self._front
while aux is not None:
# append the values in a list to display
values_list.append(aux.val)
aux = aux.next
return "[" + ", ".join(repr(val) for val in values_list) + "]"
if __name__ == "__main__":
import doctest
doctest.testmod()
| -1 |
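Beyond the doctests above, a short round trip through the linked-list deque shows the mutating operations working together; this is illustrative usage only, assuming the Deque class listed above is importable:
d = Deque([2, 3])
d.appendleft(1)           # deque is now [1, 2, 3]
d.extend([4, 5])          # deque is now [1, 2, 3, 4, 5]
assert d.pop() == 5       # removes from the back
assert d.popleft() == 1   # removes from the front
assert list(d) == [2, 3, 4] and len(d) == 3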
TheAlgorithms/Python | 6,246 | Get rid of the Union | ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| cclauss | "2022-07-11T10:58:09Z" | "2022-07-11T11:11:17Z" | ba129de7f32b6acd1efd8e942aca109bacd86646 | dad789d9034ea6fb183bddb1a34b6b89d379e422 | Get rid of the Union. ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| name: pre-commit
on: [push, pull_request]
jobs:
pre-commit:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/cache@v3
with:
path: |
~/.cache/pre-commit
~/.cache/pip
key: ${{ runner.os }}-pre-commit-${{ hashFiles('.pre-commit-config.yaml') }}
- uses: actions/setup-python@v4
with:
python-version: 3.x
# - uses: psf/[email protected]
- name: Install pre-commit
run: |
python -m pip install --upgrade pip
python -m pip install --upgrade pre-commit
- run: pre-commit run --verbose --all-files --show-diff-on-failure
| name: pre-commit
on:
push:
branches: [ master ]
pull_request:
branches: [ master ]
jobs:
pre-commit:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/cache@v3
with:
path: |
~/.cache/pre-commit
~/.cache/pip
key: ${{ runner.os }}-pre-commit-${{ hashFiles('.pre-commit-config.yaml') }}
- uses: actions/setup-python@v4
with:
python-version: 3.x
# - uses: psf/[email protected]
- name: Install pre-commit
run: |
python -m pip install --upgrade pip
python -m pip install --upgrade pre-commit
- run: pre-commit run --verbose --all-files --show-diff-on-failure
| 1 |
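The title of PR 6,246 refers to dropping typing.Union in favour of the PEP 604 pipe syntax (the row's own diff only touches the workflow file shown above); a generic before/after sketch of what such a change looks like, not taken from the actual PR diff, is:
from __future__ import annotations  # lets pre-3.10 interpreters accept the pipe syntax in annotations
from typing import Union
# before the change: explicit typing.Union in the return annotation
def middle_old(nums: list) -> Union[int, float]:
    return sorted(nums)[len(nums) // 2]
# after the change: PEP 604 pipe syntax, no typing.Union import needed
def middle_new(nums: list) -> int | float:
    return sorted(nums)[len(nums) // 2]
assert middle_old([3, 1, 2]) == middle_new([3, 1, 2]) == 2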
TheAlgorithms/Python | 6,246 | Get rid of the Union | ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| cclauss | "2022-07-11T10:58:09Z" | "2022-07-11T11:11:17Z" | ba129de7f32b6acd1efd8e942aca109bacd86646 | dad789d9034ea6fb183bddb1a34b6b89d379e422 | Get rid of the Union. ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
|
## Arithmetic Analysis
* [Bisection](arithmetic_analysis/bisection.py)
* [Gaussian Elimination](arithmetic_analysis/gaussian_elimination.py)
* [In Static Equilibrium](arithmetic_analysis/in_static_equilibrium.py)
* [Intersection](arithmetic_analysis/intersection.py)
* [Jacobi Iteration Method](arithmetic_analysis/jacobi_iteration_method.py)
* [Lu Decomposition](arithmetic_analysis/lu_decomposition.py)
* [Newton Forward Interpolation](arithmetic_analysis/newton_forward_interpolation.py)
* [Newton Method](arithmetic_analysis/newton_method.py)
* [Newton Raphson](arithmetic_analysis/newton_raphson.py)
* [Secant Method](arithmetic_analysis/secant_method.py)
## Audio Filters
* [Butterworth Filter](audio_filters/butterworth_filter.py)
* [Iir Filter](audio_filters/iir_filter.py)
* [Show Response](audio_filters/show_response.py)
## Backtracking
* [All Combinations](backtracking/all_combinations.py)
* [All Permutations](backtracking/all_permutations.py)
* [All Subsequences](backtracking/all_subsequences.py)
* [Coloring](backtracking/coloring.py)
* [Hamiltonian Cycle](backtracking/hamiltonian_cycle.py)
* [Knight Tour](backtracking/knight_tour.py)
* [Minimax](backtracking/minimax.py)
* [N Queens](backtracking/n_queens.py)
* [N Queens Math](backtracking/n_queens_math.py)
* [Rat In Maze](backtracking/rat_in_maze.py)
* [Sudoku](backtracking/sudoku.py)
* [Sum Of Subsets](backtracking/sum_of_subsets.py)
## Bit Manipulation
* [Binary And Operator](bit_manipulation/binary_and_operator.py)
* [Binary Count Setbits](bit_manipulation/binary_count_setbits.py)
* [Binary Count Trailing Zeros](bit_manipulation/binary_count_trailing_zeros.py)
* [Binary Or Operator](bit_manipulation/binary_or_operator.py)
* [Binary Shifts](bit_manipulation/binary_shifts.py)
* [Binary Twos Complement](bit_manipulation/binary_twos_complement.py)
* [Binary Xor Operator](bit_manipulation/binary_xor_operator.py)
* [Count 1S Brian Kernighan Method](bit_manipulation/count_1s_brian_kernighan_method.py)
* [Count Number Of One Bits](bit_manipulation/count_number_of_one_bits.py)
* [Gray Code Sequence](bit_manipulation/gray_code_sequence.py)
* [Reverse Bits](bit_manipulation/reverse_bits.py)
* [Single Bit Manipulation Operations](bit_manipulation/single_bit_manipulation_operations.py)
## Blockchain
* [Chinese Remainder Theorem](blockchain/chinese_remainder_theorem.py)
* [Diophantine Equation](blockchain/diophantine_equation.py)
* [Modular Division](blockchain/modular_division.py)
## Boolean Algebra
* [Quine Mc Cluskey](boolean_algebra/quine_mc_cluskey.py)
## Cellular Automata
* [Conways Game Of Life](cellular_automata/conways_game_of_life.py)
* [Game Of Life](cellular_automata/game_of_life.py)
* [Nagel Schrekenberg](cellular_automata/nagel_schrekenberg.py)
* [One Dimensional](cellular_automata/one_dimensional.py)
## Ciphers
* [A1Z26](ciphers/a1z26.py)
* [Affine Cipher](ciphers/affine_cipher.py)
* [Atbash](ciphers/atbash.py)
* [Baconian Cipher](ciphers/baconian_cipher.py)
* [Base16](ciphers/base16.py)
* [Base32](ciphers/base32.py)
* [Base64](ciphers/base64.py)
* [Base85](ciphers/base85.py)
* [Beaufort Cipher](ciphers/beaufort_cipher.py)
* [Bifid](ciphers/bifid.py)
* [Brute Force Caesar Cipher](ciphers/brute_force_caesar_cipher.py)
* [Caesar Cipher](ciphers/caesar_cipher.py)
* [Cryptomath Module](ciphers/cryptomath_module.py)
* [Decrypt Caesar With Chi Squared](ciphers/decrypt_caesar_with_chi_squared.py)
* [Deterministic Miller Rabin](ciphers/deterministic_miller_rabin.py)
* [Diffie](ciphers/diffie.py)
* [Diffie Hellman](ciphers/diffie_hellman.py)
* [Elgamal Key Generator](ciphers/elgamal_key_generator.py)
* [Enigma Machine2](ciphers/enigma_machine2.py)
* [Hill Cipher](ciphers/hill_cipher.py)
* [Mixed Keyword Cypher](ciphers/mixed_keyword_cypher.py)
* [Mono Alphabetic Ciphers](ciphers/mono_alphabetic_ciphers.py)
* [Morse Code](ciphers/morse_code.py)
* [Onepad Cipher](ciphers/onepad_cipher.py)
* [Playfair Cipher](ciphers/playfair_cipher.py)
* [Polybius](ciphers/polybius.py)
* [Porta Cipher](ciphers/porta_cipher.py)
* [Rabin Miller](ciphers/rabin_miller.py)
* [Rail Fence Cipher](ciphers/rail_fence_cipher.py)
* [Rot13](ciphers/rot13.py)
* [Rsa Cipher](ciphers/rsa_cipher.py)
* [Rsa Factorization](ciphers/rsa_factorization.py)
* [Rsa Key Generator](ciphers/rsa_key_generator.py)
* [Shuffled Shift Cipher](ciphers/shuffled_shift_cipher.py)
* [Simple Keyword Cypher](ciphers/simple_keyword_cypher.py)
* [Simple Substitution Cipher](ciphers/simple_substitution_cipher.py)
* [Trafid Cipher](ciphers/trafid_cipher.py)
* [Transposition Cipher](ciphers/transposition_cipher.py)
* [Transposition Cipher Encrypt Decrypt File](ciphers/transposition_cipher_encrypt_decrypt_file.py)
* [Vigenere Cipher](ciphers/vigenere_cipher.py)
* [Xor Cipher](ciphers/xor_cipher.py)
## Compression
* [Burrows Wheeler](compression/burrows_wheeler.py)
* [Huffman](compression/huffman.py)
* [Lempel Ziv](compression/lempel_ziv.py)
* [Lempel Ziv Decompress](compression/lempel_ziv_decompress.py)
* [Peak Signal To Noise Ratio](compression/peak_signal_to_noise_ratio.py)
## Computer Vision
* [Cnn Classification](computer_vision/cnn_classification.py)
* [Flip Augmentation](computer_vision/flip_augmentation.py)
* [Harris Corner](computer_vision/harris_corner.py)
* [Horn Schunck](computer_vision/horn_schunck.py)
* [Mean Threshold](computer_vision/mean_threshold.py)
* [Mosaic Augmentation](computer_vision/mosaic_augmentation.py)
* [Pooling Functions](computer_vision/pooling_functions.py)
## Conversions
* [Binary To Decimal](conversions/binary_to_decimal.py)
* [Binary To Hexadecimal](conversions/binary_to_hexadecimal.py)
* [Binary To Octal](conversions/binary_to_octal.py)
* [Decimal To Any](conversions/decimal_to_any.py)
* [Decimal To Binary](conversions/decimal_to_binary.py)
* [Decimal To Binary Recursion](conversions/decimal_to_binary_recursion.py)
* [Decimal To Hexadecimal](conversions/decimal_to_hexadecimal.py)
* [Decimal To Octal](conversions/decimal_to_octal.py)
* [Excel Title To Column](conversions/excel_title_to_column.py)
* [Hex To Bin](conversions/hex_to_bin.py)
* [Hexadecimal To Decimal](conversions/hexadecimal_to_decimal.py)
* [Length Conversion](conversions/length_conversion.py)
* [Molecular Chemistry](conversions/molecular_chemistry.py)
* [Octal To Decimal](conversions/octal_to_decimal.py)
* [Prefix Conversions](conversions/prefix_conversions.py)
* [Prefix Conversions String](conversions/prefix_conversions_string.py)
* [Pressure Conversions](conversions/pressure_conversions.py)
* [Rgb Hsv Conversion](conversions/rgb_hsv_conversion.py)
* [Roman Numerals](conversions/roman_numerals.py)
* [Temperature Conversions](conversions/temperature_conversions.py)
* [Volume Conversions](conversions/volume_conversions.py)
* [Weight Conversion](conversions/weight_conversion.py)
## Data Structures
* Binary Tree
* [Avl Tree](data_structures/binary_tree/avl_tree.py)
* [Basic Binary Tree](data_structures/binary_tree/basic_binary_tree.py)
* [Binary Search Tree](data_structures/binary_tree/binary_search_tree.py)
* [Binary Search Tree Recursive](data_structures/binary_tree/binary_search_tree_recursive.py)
* [Binary Tree Mirror](data_structures/binary_tree/binary_tree_mirror.py)
* [Binary Tree Traversals](data_structures/binary_tree/binary_tree_traversals.py)
* [Fenwick Tree](data_structures/binary_tree/fenwick_tree.py)
* [Lazy Segment Tree](data_structures/binary_tree/lazy_segment_tree.py)
* [Lowest Common Ancestor](data_structures/binary_tree/lowest_common_ancestor.py)
* [Merge Two Binary Trees](data_structures/binary_tree/merge_two_binary_trees.py)
* [Non Recursive Segment Tree](data_structures/binary_tree/non_recursive_segment_tree.py)
* [Number Of Possible Binary Trees](data_structures/binary_tree/number_of_possible_binary_trees.py)
* [Red Black Tree](data_structures/binary_tree/red_black_tree.py)
* [Segment Tree](data_structures/binary_tree/segment_tree.py)
* [Segment Tree Other](data_structures/binary_tree/segment_tree_other.py)
* [Treap](data_structures/binary_tree/treap.py)
* [Wavelet Tree](data_structures/binary_tree/wavelet_tree.py)
* Disjoint Set
* [Alternate Disjoint Set](data_structures/disjoint_set/alternate_disjoint_set.py)
* [Disjoint Set](data_structures/disjoint_set/disjoint_set.py)
* Hashing
* [Double Hash](data_structures/hashing/double_hash.py)
* [Hash Table](data_structures/hashing/hash_table.py)
* [Hash Table With Linked List](data_structures/hashing/hash_table_with_linked_list.py)
* Number Theory
* [Prime Numbers](data_structures/hashing/number_theory/prime_numbers.py)
* [Quadratic Probing](data_structures/hashing/quadratic_probing.py)
* Heap
* [Binomial Heap](data_structures/heap/binomial_heap.py)
* [Heap](data_structures/heap/heap.py)
* [Heap Generic](data_structures/heap/heap_generic.py)
* [Max Heap](data_structures/heap/max_heap.py)
* [Min Heap](data_structures/heap/min_heap.py)
* [Randomized Heap](data_structures/heap/randomized_heap.py)
* [Skew Heap](data_structures/heap/skew_heap.py)
* Linked List
* [Circular Linked List](data_structures/linked_list/circular_linked_list.py)
* [Deque Doubly](data_structures/linked_list/deque_doubly.py)
* [Doubly Linked List](data_structures/linked_list/doubly_linked_list.py)
* [Doubly Linked List Two](data_structures/linked_list/doubly_linked_list_two.py)
* [From Sequence](data_structures/linked_list/from_sequence.py)
* [Has Loop](data_structures/linked_list/has_loop.py)
* [Is Palindrome](data_structures/linked_list/is_palindrome.py)
* [Merge Two Lists](data_structures/linked_list/merge_two_lists.py)
* [Middle Element Of Linked List](data_structures/linked_list/middle_element_of_linked_list.py)
* [Print Reverse](data_structures/linked_list/print_reverse.py)
* [Singly Linked List](data_structures/linked_list/singly_linked_list.py)
* [Skip List](data_structures/linked_list/skip_list.py)
* [Swap Nodes](data_structures/linked_list/swap_nodes.py)
* Queue
* [Circular Queue](data_structures/queue/circular_queue.py)
* [Circular Queue Linked List](data_structures/queue/circular_queue_linked_list.py)
* [Double Ended Queue](data_structures/queue/double_ended_queue.py)
* [Linked Queue](data_structures/queue/linked_queue.py)
* [Priority Queue Using List](data_structures/queue/priority_queue_using_list.py)
* [Queue On List](data_structures/queue/queue_on_list.py)
* [Queue On Pseudo Stack](data_structures/queue/queue_on_pseudo_stack.py)
* Stacks
* [Balanced Parentheses](data_structures/stacks/balanced_parentheses.py)
* [Dijkstras Two Stack Algorithm](data_structures/stacks/dijkstras_two_stack_algorithm.py)
* [Evaluate Postfix Notations](data_structures/stacks/evaluate_postfix_notations.py)
* [Infix To Postfix Conversion](data_structures/stacks/infix_to_postfix_conversion.py)
* [Infix To Prefix Conversion](data_structures/stacks/infix_to_prefix_conversion.py)
* [Next Greater Element](data_structures/stacks/next_greater_element.py)
* [Postfix Evaluation](data_structures/stacks/postfix_evaluation.py)
* [Prefix Evaluation](data_structures/stacks/prefix_evaluation.py)
* [Stack](data_structures/stacks/stack.py)
* [Stack With Doubly Linked List](data_structures/stacks/stack_with_doubly_linked_list.py)
* [Stack With Singly Linked List](data_structures/stacks/stack_with_singly_linked_list.py)
* [Stock Span Problem](data_structures/stacks/stock_span_problem.py)
* Trie
* [Trie](data_structures/trie/trie.py)
## Digital Image Processing
* [Change Brightness](digital_image_processing/change_brightness.py)
* [Change Contrast](digital_image_processing/change_contrast.py)
* [Convert To Negative](digital_image_processing/convert_to_negative.py)
* Dithering
* [Burkes](digital_image_processing/dithering/burkes.py)
* Edge Detection
* [Canny](digital_image_processing/edge_detection/canny.py)
* Filters
* [Bilateral Filter](digital_image_processing/filters/bilateral_filter.py)
* [Convolve](digital_image_processing/filters/convolve.py)
* [Gabor Filter](digital_image_processing/filters/gabor_filter.py)
* [Gaussian Filter](digital_image_processing/filters/gaussian_filter.py)
* [Median Filter](digital_image_processing/filters/median_filter.py)
* [Sobel Filter](digital_image_processing/filters/sobel_filter.py)
* Histogram Equalization
* [Histogram Stretch](digital_image_processing/histogram_equalization/histogram_stretch.py)
* [Index Calculation](digital_image_processing/index_calculation.py)
* Morphological Operations
* [Dilation Operation](digital_image_processing/morphological_operations/dilation_operation.py)
* [Erosion Operation](digital_image_processing/morphological_operations/erosion_operation.py)
* Resize
* [Resize](digital_image_processing/resize/resize.py)
* Rotation
* [Rotation](digital_image_processing/rotation/rotation.py)
* [Sepia](digital_image_processing/sepia.py)
* [Test Digital Image Processing](digital_image_processing/test_digital_image_processing.py)
## Divide And Conquer
* [Closest Pair Of Points](divide_and_conquer/closest_pair_of_points.py)
* [Convex Hull](divide_and_conquer/convex_hull.py)
* [Heaps Algorithm](divide_and_conquer/heaps_algorithm.py)
* [Heaps Algorithm Iterative](divide_and_conquer/heaps_algorithm_iterative.py)
* [Inversions](divide_and_conquer/inversions.py)
* [Kth Order Statistic](divide_and_conquer/kth_order_statistic.py)
* [Max Difference Pair](divide_and_conquer/max_difference_pair.py)
* [Max Subarray Sum](divide_and_conquer/max_subarray_sum.py)
* [Mergesort](divide_and_conquer/mergesort.py)
* [Peak](divide_and_conquer/peak.py)
* [Power](divide_and_conquer/power.py)
* [Strassen Matrix Multiplication](divide_and_conquer/strassen_matrix_multiplication.py)
## Dynamic Programming
* [Abbreviation](dynamic_programming/abbreviation.py)
* [All Construct](dynamic_programming/all_construct.py)
* [Bitmask](dynamic_programming/bitmask.py)
* [Catalan Numbers](dynamic_programming/catalan_numbers.py)
* [Climbing Stairs](dynamic_programming/climbing_stairs.py)
* [Edit Distance](dynamic_programming/edit_distance.py)
* [Factorial](dynamic_programming/factorial.py)
* [Fast Fibonacci](dynamic_programming/fast_fibonacci.py)
* [Fibonacci](dynamic_programming/fibonacci.py)
* [Floyd Warshall](dynamic_programming/floyd_warshall.py)
* [Fractional Knapsack](dynamic_programming/fractional_knapsack.py)
* [Fractional Knapsack 2](dynamic_programming/fractional_knapsack_2.py)
* [Integer Partition](dynamic_programming/integer_partition.py)
* [Iterating Through Submasks](dynamic_programming/iterating_through_submasks.py)
* [Knapsack](dynamic_programming/knapsack.py)
* [Longest Common Subsequence](dynamic_programming/longest_common_subsequence.py)
* [Longest Increasing Subsequence](dynamic_programming/longest_increasing_subsequence.py)
* [Longest Increasing Subsequence O(Nlogn)](dynamic_programming/longest_increasing_subsequence_o(nlogn).py)
* [Longest Sub Array](dynamic_programming/longest_sub_array.py)
* [Matrix Chain Order](dynamic_programming/matrix_chain_order.py)
* [Max Non Adjacent Sum](dynamic_programming/max_non_adjacent_sum.py)
* [Max Sub Array](dynamic_programming/max_sub_array.py)
* [Max Sum Contiguous Subsequence](dynamic_programming/max_sum_contiguous_subsequence.py)
* [Minimum Coin Change](dynamic_programming/minimum_coin_change.py)
* [Minimum Cost Path](dynamic_programming/minimum_cost_path.py)
* [Minimum Partition](dynamic_programming/minimum_partition.py)
* [Minimum Steps To One](dynamic_programming/minimum_steps_to_one.py)
* [Optimal Binary Search Tree](dynamic_programming/optimal_binary_search_tree.py)
* [Rod Cutting](dynamic_programming/rod_cutting.py)
* [Subset Generation](dynamic_programming/subset_generation.py)
* [Sum Of Subset](dynamic_programming/sum_of_subset.py)
## Electronics
* [Carrier Concentration](electronics/carrier_concentration.py)
* [Coulombs Law](electronics/coulombs_law.py)
* [Electric Power](electronics/electric_power.py)
* [Ohms Law](electronics/ohms_law.py)
## File Transfer
* [Receive File](file_transfer/receive_file.py)
* [Send File](file_transfer/send_file.py)
* Tests
* [Test Send File](file_transfer/tests/test_send_file.py)
## Financial
* [Equated Monthly Installments](financial/equated_monthly_installments.py)
* [Interest](financial/interest.py)
## Fractals
* [Julia Sets](fractals/julia_sets.py)
* [Koch Snowflake](fractals/koch_snowflake.py)
* [Mandelbrot](fractals/mandelbrot.py)
* [Sierpinski Triangle](fractals/sierpinski_triangle.py)
## Fuzzy Logic
* [Fuzzy Operations](fuzzy_logic/fuzzy_operations.py)
## Genetic Algorithm
* [Basic String](genetic_algorithm/basic_string.py)
## Geodesy
* [Haversine Distance](geodesy/haversine_distance.py)
* [Lamberts Ellipsoidal Distance](geodesy/lamberts_ellipsoidal_distance.py)
## Graphics
* [Bezier Curve](graphics/bezier_curve.py)
* [Vector3 For 2D Rendering](graphics/vector3_for_2d_rendering.py)
## Graphs
* [A Star](graphs/a_star.py)
* [Articulation Points](graphs/articulation_points.py)
* [Basic Graphs](graphs/basic_graphs.py)
* [Bellman Ford](graphs/bellman_ford.py)
* [Bfs Shortest Path](graphs/bfs_shortest_path.py)
* [Bfs Zero One Shortest Path](graphs/bfs_zero_one_shortest_path.py)
* [Bidirectional A Star](graphs/bidirectional_a_star.py)
* [Bidirectional Breadth First Search](graphs/bidirectional_breadth_first_search.py)
* [Boruvka](graphs/boruvka.py)
* [Breadth First Search](graphs/breadth_first_search.py)
* [Breadth First Search 2](graphs/breadth_first_search_2.py)
* [Breadth First Search Shortest Path](graphs/breadth_first_search_shortest_path.py)
* [Check Bipartite Graph Bfs](graphs/check_bipartite_graph_bfs.py)
* [Check Bipartite Graph Dfs](graphs/check_bipartite_graph_dfs.py)
* [Check Cycle](graphs/check_cycle.py)
* [Connected Components](graphs/connected_components.py)
* [Depth First Search](graphs/depth_first_search.py)
* [Depth First Search 2](graphs/depth_first_search_2.py)
* [Dijkstra](graphs/dijkstra.py)
* [Dijkstra 2](graphs/dijkstra_2.py)
* [Dijkstra Algorithm](graphs/dijkstra_algorithm.py)
* [Dinic](graphs/dinic.py)
* [Directed And Undirected (Weighted) Graph](graphs/directed_and_undirected_(weighted)_graph.py)
* [Edmonds Karp Multiple Source And Sink](graphs/edmonds_karp_multiple_source_and_sink.py)
* [Eulerian Path And Circuit For Undirected Graph](graphs/eulerian_path_and_circuit_for_undirected_graph.py)
* [Even Tree](graphs/even_tree.py)
* [Finding Bridges](graphs/finding_bridges.py)
* [Frequent Pattern Graph Miner](graphs/frequent_pattern_graph_miner.py)
* [G Topological Sort](graphs/g_topological_sort.py)
* [Gale Shapley Bigraph](graphs/gale_shapley_bigraph.py)
* [Graph List](graphs/graph_list.py)
* [Graph Matrix](graphs/graph_matrix.py)
* [Graphs Floyd Warshall](graphs/graphs_floyd_warshall.py)
* [Greedy Best First](graphs/greedy_best_first.py)
* [Greedy Min Vertex Cover](graphs/greedy_min_vertex_cover.py)
* [Kahns Algorithm Long](graphs/kahns_algorithm_long.py)
* [Kahns Algorithm Topo](graphs/kahns_algorithm_topo.py)
* [Karger](graphs/karger.py)
* [Markov Chain](graphs/markov_chain.py)
* [Matching Min Vertex Cover](graphs/matching_min_vertex_cover.py)
* [Minimum Path Sum](graphs/minimum_path_sum.py)
* [Minimum Spanning Tree Boruvka](graphs/minimum_spanning_tree_boruvka.py)
* [Minimum Spanning Tree Kruskal](graphs/minimum_spanning_tree_kruskal.py)
* [Minimum Spanning Tree Kruskal2](graphs/minimum_spanning_tree_kruskal2.py)
* [Minimum Spanning Tree Prims](graphs/minimum_spanning_tree_prims.py)
* [Minimum Spanning Tree Prims2](graphs/minimum_spanning_tree_prims2.py)
* [Multi Heuristic Astar](graphs/multi_heuristic_astar.py)
* [Page Rank](graphs/page_rank.py)
* [Prim](graphs/prim.py)
* [Random Graph Generator](graphs/random_graph_generator.py)
* [Scc Kosaraju](graphs/scc_kosaraju.py)
* [Strongly Connected Components](graphs/strongly_connected_components.py)
* [Tarjans Scc](graphs/tarjans_scc.py)
* Tests
* [Test Min Spanning Tree Kruskal](graphs/tests/test_min_spanning_tree_kruskal.py)
* [Test Min Spanning Tree Prim](graphs/tests/test_min_spanning_tree_prim.py)
## Greedy Methods
* [Optimal Merge Pattern](greedy_methods/optimal_merge_pattern.py)
## Hashes
* [Adler32](hashes/adler32.py)
* [Chaos Machine](hashes/chaos_machine.py)
* [Djb2](hashes/djb2.py)
* [Enigma Machine](hashes/enigma_machine.py)
* [Hamming Code](hashes/hamming_code.py)
* [Luhn](hashes/luhn.py)
* [Md5](hashes/md5.py)
* [Sdbm](hashes/sdbm.py)
* [Sha1](hashes/sha1.py)
* [Sha256](hashes/sha256.py)
## Knapsack
* [Greedy Knapsack](knapsack/greedy_knapsack.py)
* [Knapsack](knapsack/knapsack.py)
* Tests
* [Test Greedy Knapsack](knapsack/tests/test_greedy_knapsack.py)
* [Test Knapsack](knapsack/tests/test_knapsack.py)
## Linear Algebra
* Src
* [Conjugate Gradient](linear_algebra/src/conjugate_gradient.py)
* [Lib](linear_algebra/src/lib.py)
* [Polynom For Points](linear_algebra/src/polynom_for_points.py)
* [Power Iteration](linear_algebra/src/power_iteration.py)
* [Rayleigh Quotient](linear_algebra/src/rayleigh_quotient.py)
* [Schur Complement](linear_algebra/src/schur_complement.py)
* [Test Linear Algebra](linear_algebra/src/test_linear_algebra.py)
* [Transformations 2D](linear_algebra/src/transformations_2d.py)
## Machine Learning
* [Astar](machine_learning/astar.py)
* [Data Transformations](machine_learning/data_transformations.py)
* [Decision Tree](machine_learning/decision_tree.py)
* Forecasting
* [Run](machine_learning/forecasting/run.py)
* [Gaussian Naive Bayes](machine_learning/gaussian_naive_bayes.py)
* [Gradient Boosting Regressor](machine_learning/gradient_boosting_regressor.py)
* [Gradient Descent](machine_learning/gradient_descent.py)
* [K Means Clust](machine_learning/k_means_clust.py)
* [K Nearest Neighbours](machine_learning/k_nearest_neighbours.py)
* [Knn Sklearn](machine_learning/knn_sklearn.py)
* [Linear Discriminant Analysis](machine_learning/linear_discriminant_analysis.py)
* [Linear Regression](machine_learning/linear_regression.py)
* Local Weighted Learning
* [Local Weighted Learning](machine_learning/local_weighted_learning/local_weighted_learning.py)
* [Logistic Regression](machine_learning/logistic_regression.py)
* Lstm
* [Lstm Prediction](machine_learning/lstm/lstm_prediction.py)
* [Multilayer Perceptron Classifier](machine_learning/multilayer_perceptron_classifier.py)
* [Polymonial Regression](machine_learning/polymonial_regression.py)
* [Random Forest Classifier](machine_learning/random_forest_classifier.py)
* [Random Forest Regressor](machine_learning/random_forest_regressor.py)
* [Scoring Functions](machine_learning/scoring_functions.py)
* [Sequential Minimum Optimization](machine_learning/sequential_minimum_optimization.py)
* [Similarity Search](machine_learning/similarity_search.py)
* [Word Frequency Functions](machine_learning/word_frequency_functions.py)
## Maths
* [3N Plus 1](maths/3n_plus_1.py)
* [Abs](maths/abs.py)
* [Abs Max](maths/abs_max.py)
* [Abs Min](maths/abs_min.py)
* [Add](maths/add.py)
* [Aliquot Sum](maths/aliquot_sum.py)
* [Allocation Number](maths/allocation_number.py)
* [Area](maths/area.py)
* [Area Under Curve](maths/area_under_curve.py)
* [Armstrong Numbers](maths/armstrong_numbers.py)
* [Average Absolute Deviation](maths/average_absolute_deviation.py)
* [Average Mean](maths/average_mean.py)
* [Average Median](maths/average_median.py)
* [Average Mode](maths/average_mode.py)
* [Bailey Borwein Plouffe](maths/bailey_borwein_plouffe.py)
* [Basic Maths](maths/basic_maths.py)
* [Binary Exp Mod](maths/binary_exp_mod.py)
* [Binary Exponentiation](maths/binary_exponentiation.py)
* [Binary Exponentiation 2](maths/binary_exponentiation_2.py)
* [Binary Exponentiation 3](maths/binary_exponentiation_3.py)
* [Binomial Coefficient](maths/binomial_coefficient.py)
* [Binomial Distribution](maths/binomial_distribution.py)
* [Bisection](maths/bisection.py)
* [Ceil](maths/ceil.py)
* [Check Polygon](maths/check_polygon.py)
* [Chudnovsky Algorithm](maths/chudnovsky_algorithm.py)
* [Collatz Sequence](maths/collatz_sequence.py)
* [Combinations](maths/combinations.py)
* [Decimal Isolate](maths/decimal_isolate.py)
* [Double Factorial Iterative](maths/double_factorial_iterative.py)
* [Double Factorial Recursive](maths/double_factorial_recursive.py)
* [Entropy](maths/entropy.py)
* [Euclidean Distance](maths/euclidean_distance.py)
* [Euclidean Gcd](maths/euclidean_gcd.py)
* [Euler Method](maths/euler_method.py)
* [Euler Modified](maths/euler_modified.py)
* [Eulers Totient](maths/eulers_totient.py)
* [Extended Euclidean Algorithm](maths/extended_euclidean_algorithm.py)
* [Factorial Iterative](maths/factorial_iterative.py)
* [Factorial Recursive](maths/factorial_recursive.py)
* [Factors](maths/factors.py)
* [Fermat Little Theorem](maths/fermat_little_theorem.py)
* [Fibonacci](maths/fibonacci.py)
* [Find Max](maths/find_max.py)
* [Find Max Recursion](maths/find_max_recursion.py)
* [Find Min](maths/find_min.py)
* [Find Min Recursion](maths/find_min_recursion.py)
* [Floor](maths/floor.py)
* [Gamma](maths/gamma.py)
* [Gamma Recursive](maths/gamma_recursive.py)
* [Gaussian](maths/gaussian.py)
* [Greatest Common Divisor](maths/greatest_common_divisor.py)
* [Greedy Coin Change](maths/greedy_coin_change.py)
* [Hardy Ramanujanalgo](maths/hardy_ramanujanalgo.py)
* [Integration By Simpson Approx](maths/integration_by_simpson_approx.py)
* [Is Ip V4 Address Valid](maths/is_ip_v4_address_valid.py)
* [Is Square Free](maths/is_square_free.py)
* [Jaccard Similarity](maths/jaccard_similarity.py)
* [Kadanes](maths/kadanes.py)
* [Karatsuba](maths/karatsuba.py)
* [Krishnamurthy Number](maths/krishnamurthy_number.py)
* [Kth Lexicographic Permutation](maths/kth_lexicographic_permutation.py)
* [Largest Of Very Large Numbers](maths/largest_of_very_large_numbers.py)
* [Largest Subarray Sum](maths/largest_subarray_sum.py)
* [Least Common Multiple](maths/least_common_multiple.py)
* [Line Length](maths/line_length.py)
* [Lucas Lehmer Primality Test](maths/lucas_lehmer_primality_test.py)
* [Lucas Series](maths/lucas_series.py)
* [Matrix Exponentiation](maths/matrix_exponentiation.py)
* [Max Sum Sliding Window](maths/max_sum_sliding_window.py)
* [Median Of Two Arrays](maths/median_of_two_arrays.py)
* [Miller Rabin](maths/miller_rabin.py)
* [Mobius Function](maths/mobius_function.py)
* [Modular Exponential](maths/modular_exponential.py)
* [Monte Carlo](maths/monte_carlo.py)
* [Monte Carlo Dice](maths/monte_carlo_dice.py)
* [Nevilles Method](maths/nevilles_method.py)
* [Newton Raphson](maths/newton_raphson.py)
* [Number Of Digits](maths/number_of_digits.py)
* [Numerical Integration](maths/numerical_integration.py)
* [Perfect Cube](maths/perfect_cube.py)
* [Perfect Number](maths/perfect_number.py)
* [Perfect Square](maths/perfect_square.py)
* [Persistence](maths/persistence.py)
* [Pi Monte Carlo Estimation](maths/pi_monte_carlo_estimation.py)
* [Points Are Collinear 3D](maths/points_are_collinear_3d.py)
* [Pollard Rho](maths/pollard_rho.py)
* [Polynomial Evaluation](maths/polynomial_evaluation.py)
* [Power Using Recursion](maths/power_using_recursion.py)
* [Prime Check](maths/prime_check.py)
* [Prime Factors](maths/prime_factors.py)
* [Prime Numbers](maths/prime_numbers.py)
* [Prime Sieve Eratosthenes](maths/prime_sieve_eratosthenes.py)
* [Primelib](maths/primelib.py)
* [Proth Number](maths/proth_number.py)
* [Pythagoras](maths/pythagoras.py)
* [Qr Decomposition](maths/qr_decomposition.py)
* [Quadratic Equations Complex Numbers](maths/quadratic_equations_complex_numbers.py)
* [Radians](maths/radians.py)
* [Radix2 Fft](maths/radix2_fft.py)
* [Relu](maths/relu.py)
* [Runge Kutta](maths/runge_kutta.py)
* [Segmented Sieve](maths/segmented_sieve.py)
* Series
* [Arithmetic](maths/series/arithmetic.py)
* [Geometric](maths/series/geometric.py)
* [Geometric Series](maths/series/geometric_series.py)
* [Harmonic](maths/series/harmonic.py)
* [Harmonic Series](maths/series/harmonic_series.py)
* [Hexagonal Numbers](maths/series/hexagonal_numbers.py)
* [P Series](maths/series/p_series.py)
* [Sieve Of Eratosthenes](maths/sieve_of_eratosthenes.py)
* [Sigmoid](maths/sigmoid.py)
* [Simpson Rule](maths/simpson_rule.py)
* [Sin](maths/sin.py)
* [Sock Merchant](maths/sock_merchant.py)
* [Softmax](maths/softmax.py)
* [Square Root](maths/square_root.py)
* [Sum Of Arithmetic Series](maths/sum_of_arithmetic_series.py)
* [Sum Of Digits](maths/sum_of_digits.py)
* [Sum Of Geometric Progression](maths/sum_of_geometric_progression.py)
* [Sylvester Sequence](maths/sylvester_sequence.py)
* [Test Prime Check](maths/test_prime_check.py)
* [Trapezoidal Rule](maths/trapezoidal_rule.py)
* [Triplet Sum](maths/triplet_sum.py)
* [Two Pointer](maths/two_pointer.py)
* [Two Sum](maths/two_sum.py)
* [Ugly Numbers](maths/ugly_numbers.py)
* [Volume](maths/volume.py)
* [Zellers Congruence](maths/zellers_congruence.py)
## Matrix
* [Count Islands In Matrix](matrix/count_islands_in_matrix.py)
* [Inverse Of Matrix](matrix/inverse_of_matrix.py)
* [Matrix Class](matrix/matrix_class.py)
* [Matrix Operation](matrix/matrix_operation.py)
* [Nth Fibonacci Using Matrix Exponentiation](matrix/nth_fibonacci_using_matrix_exponentiation.py)
* [Rotate Matrix](matrix/rotate_matrix.py)
* [Searching In Sorted Matrix](matrix/searching_in_sorted_matrix.py)
* [Sherman Morrison](matrix/sherman_morrison.py)
* [Spiral Print](matrix/spiral_print.py)
* Tests
* [Test Matrix Operation](matrix/tests/test_matrix_operation.py)
## Networking Flow
* [Ford Fulkerson](networking_flow/ford_fulkerson.py)
* [Minimum Cut](networking_flow/minimum_cut.py)
## Neural Network
* [2 Hidden Layers Neural Network](neural_network/2_hidden_layers_neural_network.py)
* [Back Propagation Neural Network](neural_network/back_propagation_neural_network.py)
* [Convolution Neural Network](neural_network/convolution_neural_network.py)
* [Perceptron](neural_network/perceptron.py)
## Other
* [Activity Selection](other/activity_selection.py)
* [Alternative List Arrange](other/alternative_list_arrange.py)
* [Check Strong Password](other/check_strong_password.py)
* [Davisb Putnamb Logemannb Loveland](other/davisb_putnamb_logemannb_loveland.py)
* [Dijkstra Bankers Algorithm](other/dijkstra_bankers_algorithm.py)
* [Doomsday](other/doomsday.py)
* [Fischer Yates Shuffle](other/fischer_yates_shuffle.py)
* [Gauss Easter](other/gauss_easter.py)
* [Graham Scan](other/graham_scan.py)
* [Greedy](other/greedy.py)
* [Least Recently Used](other/least_recently_used.py)
* [Lfu Cache](other/lfu_cache.py)
* [Linear Congruential Generator](other/linear_congruential_generator.py)
* [Lru Cache](other/lru_cache.py)
* [Magicdiamondpattern](other/magicdiamondpattern.py)
* [Nested Brackets](other/nested_brackets.py)
* [Password Generator](other/password_generator.py)
* [Scoring Algorithm](other/scoring_algorithm.py)
* [Sdes](other/sdes.py)
* [Tower Of Hanoi](other/tower_of_hanoi.py)
## Physics
* [Horizontal Projectile Motion](physics/horizontal_projectile_motion.py)
* [Lorenz Transformation Four Vector](physics/lorenz_transformation_four_vector.py)
* [N Body Simulation](physics/n_body_simulation.py)
* [Newtons Second Law Of Motion](physics/newtons_second_law_of_motion.py)
## Project Euler
* Problem 001
* [Sol1](project_euler/problem_001/sol1.py)
* [Sol2](project_euler/problem_001/sol2.py)
* [Sol3](project_euler/problem_001/sol3.py)
* [Sol4](project_euler/problem_001/sol4.py)
* [Sol5](project_euler/problem_001/sol5.py)
* [Sol6](project_euler/problem_001/sol6.py)
* [Sol7](project_euler/problem_001/sol7.py)
* Problem 002
* [Sol1](project_euler/problem_002/sol1.py)
* [Sol2](project_euler/problem_002/sol2.py)
* [Sol3](project_euler/problem_002/sol3.py)
* [Sol4](project_euler/problem_002/sol4.py)
* [Sol5](project_euler/problem_002/sol5.py)
* Problem 003
* [Sol1](project_euler/problem_003/sol1.py)
* [Sol2](project_euler/problem_003/sol2.py)
* [Sol3](project_euler/problem_003/sol3.py)
* Problem 004
* [Sol1](project_euler/problem_004/sol1.py)
* [Sol2](project_euler/problem_004/sol2.py)
* Problem 005
* [Sol1](project_euler/problem_005/sol1.py)
* [Sol2](project_euler/problem_005/sol2.py)
* Problem 006
* [Sol1](project_euler/problem_006/sol1.py)
* [Sol2](project_euler/problem_006/sol2.py)
* [Sol3](project_euler/problem_006/sol3.py)
* [Sol4](project_euler/problem_006/sol4.py)
* Problem 007
* [Sol1](project_euler/problem_007/sol1.py)
* [Sol2](project_euler/problem_007/sol2.py)
* [Sol3](project_euler/problem_007/sol3.py)
* Problem 008
* [Sol1](project_euler/problem_008/sol1.py)
* [Sol2](project_euler/problem_008/sol2.py)
* [Sol3](project_euler/problem_008/sol3.py)
* Problem 009
* [Sol1](project_euler/problem_009/sol1.py)
* [Sol2](project_euler/problem_009/sol2.py)
* [Sol3](project_euler/problem_009/sol3.py)
* Problem 010
* [Sol1](project_euler/problem_010/sol1.py)
* [Sol2](project_euler/problem_010/sol2.py)
* [Sol3](project_euler/problem_010/sol3.py)
* Problem 011
* [Sol1](project_euler/problem_011/sol1.py)
* [Sol2](project_euler/problem_011/sol2.py)
* Problem 012
* [Sol1](project_euler/problem_012/sol1.py)
* [Sol2](project_euler/problem_012/sol2.py)
* Problem 013
* [Sol1](project_euler/problem_013/sol1.py)
* Problem 014
* [Sol1](project_euler/problem_014/sol1.py)
* [Sol2](project_euler/problem_014/sol2.py)
* Problem 015
* [Sol1](project_euler/problem_015/sol1.py)
* Problem 016
* [Sol1](project_euler/problem_016/sol1.py)
* [Sol2](project_euler/problem_016/sol2.py)
* Problem 017
* [Sol1](project_euler/problem_017/sol1.py)
* Problem 018
* [Solution](project_euler/problem_018/solution.py)
* Problem 019
* [Sol1](project_euler/problem_019/sol1.py)
* Problem 020
* [Sol1](project_euler/problem_020/sol1.py)
* [Sol2](project_euler/problem_020/sol2.py)
* [Sol3](project_euler/problem_020/sol3.py)
* [Sol4](project_euler/problem_020/sol4.py)
* Problem 021
* [Sol1](project_euler/problem_021/sol1.py)
* Problem 022
* [Sol1](project_euler/problem_022/sol1.py)
* [Sol2](project_euler/problem_022/sol2.py)
* Problem 023
* [Sol1](project_euler/problem_023/sol1.py)
* Problem 024
* [Sol1](project_euler/problem_024/sol1.py)
* Problem 025
* [Sol1](project_euler/problem_025/sol1.py)
* [Sol2](project_euler/problem_025/sol2.py)
* [Sol3](project_euler/problem_025/sol3.py)
* Problem 026
* [Sol1](project_euler/problem_026/sol1.py)
* Problem 027
* [Sol1](project_euler/problem_027/sol1.py)
* Problem 028
* [Sol1](project_euler/problem_028/sol1.py)
* Problem 029
* [Sol1](project_euler/problem_029/sol1.py)
* Problem 030
* [Sol1](project_euler/problem_030/sol1.py)
* Problem 031
* [Sol1](project_euler/problem_031/sol1.py)
* [Sol2](project_euler/problem_031/sol2.py)
* Problem 032
* [Sol32](project_euler/problem_032/sol32.py)
* Problem 033
* [Sol1](project_euler/problem_033/sol1.py)
* Problem 034
* [Sol1](project_euler/problem_034/sol1.py)
* Problem 035
* [Sol1](project_euler/problem_035/sol1.py)
* Problem 036
* [Sol1](project_euler/problem_036/sol1.py)
* Problem 037
* [Sol1](project_euler/problem_037/sol1.py)
* Problem 038
* [Sol1](project_euler/problem_038/sol1.py)
* Problem 039
* [Sol1](project_euler/problem_039/sol1.py)
* Problem 040
* [Sol1](project_euler/problem_040/sol1.py)
* Problem 041
* [Sol1](project_euler/problem_041/sol1.py)
* Problem 042
* [Solution42](project_euler/problem_042/solution42.py)
* Problem 043
* [Sol1](project_euler/problem_043/sol1.py)
* Problem 044
* [Sol1](project_euler/problem_044/sol1.py)
* Problem 045
* [Sol1](project_euler/problem_045/sol1.py)
* Problem 046
* [Sol1](project_euler/problem_046/sol1.py)
* Problem 047
* [Sol1](project_euler/problem_047/sol1.py)
* Problem 048
* [Sol1](project_euler/problem_048/sol1.py)
* Problem 049
* [Sol1](project_euler/problem_049/sol1.py)
* Problem 050
* [Sol1](project_euler/problem_050/sol1.py)
* Problem 051
* [Sol1](project_euler/problem_051/sol1.py)
* Problem 052
* [Sol1](project_euler/problem_052/sol1.py)
* Problem 053
* [Sol1](project_euler/problem_053/sol1.py)
* Problem 054
* [Sol1](project_euler/problem_054/sol1.py)
* [Test Poker Hand](project_euler/problem_054/test_poker_hand.py)
* Problem 055
* [Sol1](project_euler/problem_055/sol1.py)
* Problem 056
* [Sol1](project_euler/problem_056/sol1.py)
* Problem 057
* [Sol1](project_euler/problem_057/sol1.py)
* Problem 058
* [Sol1](project_euler/problem_058/sol1.py)
* Problem 059
* [Sol1](project_euler/problem_059/sol1.py)
* Problem 062
* [Sol1](project_euler/problem_062/sol1.py)
* Problem 063
* [Sol1](project_euler/problem_063/sol1.py)
* Problem 064
* [Sol1](project_euler/problem_064/sol1.py)
* Problem 065
* [Sol1](project_euler/problem_065/sol1.py)
* Problem 067
* [Sol1](project_euler/problem_067/sol1.py)
* [Sol2](project_euler/problem_067/sol2.py)
* Problem 068
* [Sol1](project_euler/problem_068/sol1.py)
* Problem 069
* [Sol1](project_euler/problem_069/sol1.py)
* Problem 070
* [Sol1](project_euler/problem_070/sol1.py)
* Problem 071
* [Sol1](project_euler/problem_071/sol1.py)
* Problem 072
* [Sol1](project_euler/problem_072/sol1.py)
* [Sol2](project_euler/problem_072/sol2.py)
* Problem 074
* [Sol1](project_euler/problem_074/sol1.py)
* [Sol2](project_euler/problem_074/sol2.py)
* Problem 075
* [Sol1](project_euler/problem_075/sol1.py)
* Problem 076
* [Sol1](project_euler/problem_076/sol1.py)
* Problem 077
* [Sol1](project_euler/problem_077/sol1.py)
* Problem 078
* [Sol1](project_euler/problem_078/sol1.py)
* Problem 080
* [Sol1](project_euler/problem_080/sol1.py)
* Problem 081
* [Sol1](project_euler/problem_081/sol1.py)
* Problem 085
* [Sol1](project_euler/problem_085/sol1.py)
* Problem 086
* [Sol1](project_euler/problem_086/sol1.py)
* Problem 087
* [Sol1](project_euler/problem_087/sol1.py)
* Problem 089
* [Sol1](project_euler/problem_089/sol1.py)
* Problem 091
* [Sol1](project_euler/problem_091/sol1.py)
* Problem 092
* [Sol1](project_euler/problem_092/sol1.py)
* Problem 097
* [Sol1](project_euler/problem_097/sol1.py)
* Problem 099
* [Sol1](project_euler/problem_099/sol1.py)
* Problem 101
* [Sol1](project_euler/problem_101/sol1.py)
* Problem 102
* [Sol1](project_euler/problem_102/sol1.py)
* Problem 104
* [Sol](project_euler/problem_104/sol.py)
* Problem 107
* [Sol1](project_euler/problem_107/sol1.py)
* Problem 109
* [Sol1](project_euler/problem_109/sol1.py)
* Problem 112
* [Sol1](project_euler/problem_112/sol1.py)
* Problem 113
* [Sol1](project_euler/problem_113/sol1.py)
* Problem 119
* [Sol1](project_euler/problem_119/sol1.py)
* Problem 120
* [Sol1](project_euler/problem_120/sol1.py)
* Problem 121
* [Sol1](project_euler/problem_121/sol1.py)
* Problem 123
* [Sol1](project_euler/problem_123/sol1.py)
* Problem 125
* [Sol1](project_euler/problem_125/sol1.py)
* Problem 129
* [Sol1](project_euler/problem_129/sol1.py)
* Problem 135
* [Sol1](project_euler/problem_135/sol1.py)
* Problem 144
* [Sol1](project_euler/problem_144/sol1.py)
* Problem 145
* [Sol1](project_euler/problem_145/sol1.py)
* Problem 173
* [Sol1](project_euler/problem_173/sol1.py)
* Problem 174
* [Sol1](project_euler/problem_174/sol1.py)
* Problem 180
* [Sol1](project_euler/problem_180/sol1.py)
* Problem 188
* [Sol1](project_euler/problem_188/sol1.py)
* Problem 191
* [Sol1](project_euler/problem_191/sol1.py)
* Problem 203
* [Sol1](project_euler/problem_203/sol1.py)
* Problem 205
* [Sol1](project_euler/problem_205/sol1.py)
* Problem 206
* [Sol1](project_euler/problem_206/sol1.py)
* Problem 207
* [Sol1](project_euler/problem_207/sol1.py)
* Problem 234
* [Sol1](project_euler/problem_234/sol1.py)
* Problem 301
* [Sol1](project_euler/problem_301/sol1.py)
* Problem 493
* [Sol1](project_euler/problem_493/sol1.py)
* Problem 551
* [Sol1](project_euler/problem_551/sol1.py)
* Problem 686
* [Sol1](project_euler/problem_686/sol1.py)
## Quantum
* [Deutsch Jozsa](quantum/deutsch_jozsa.py)
* [Half Adder](quantum/half_adder.py)
* [Not Gate](quantum/not_gate.py)
* [Quantum Entanglement](quantum/quantum_entanglement.py)
* [Ripple Adder Classic](quantum/ripple_adder_classic.py)
* [Single Qubit Measure](quantum/single_qubit_measure.py)
## Scheduling
* [First Come First Served](scheduling/first_come_first_served.py)
* [Highest Response Ratio Next](scheduling/highest_response_ratio_next.py)
* [Multi Level Feedback Queue](scheduling/multi_level_feedback_queue.py)
* [Non Preemptive Shortest Job First](scheduling/non_preemptive_shortest_job_first.py)
* [Round Robin](scheduling/round_robin.py)
* [Shortest Job First](scheduling/shortest_job_first.py)
## Searches
* [Binary Search](searches/binary_search.py)
* [Binary Tree Traversal](searches/binary_tree_traversal.py)
* [Double Linear Search](searches/double_linear_search.py)
* [Double Linear Search Recursion](searches/double_linear_search_recursion.py)
* [Fibonacci Search](searches/fibonacci_search.py)
* [Hill Climbing](searches/hill_climbing.py)
* [Interpolation Search](searches/interpolation_search.py)
* [Jump Search](searches/jump_search.py)
* [Linear Search](searches/linear_search.py)
* [Quick Select](searches/quick_select.py)
* [Sentinel Linear Search](searches/sentinel_linear_search.py)
* [Simple Binary Search](searches/simple_binary_search.py)
* [Simulated Annealing](searches/simulated_annealing.py)
* [Tabu Search](searches/tabu_search.py)
* [Ternary Search](searches/ternary_search.py)
## Sorts
* [Bead Sort](sorts/bead_sort.py)
* [Bitonic Sort](sorts/bitonic_sort.py)
* [Bogo Sort](sorts/bogo_sort.py)
* [Bubble Sort](sorts/bubble_sort.py)
* [Bucket Sort](sorts/bucket_sort.py)
* [Cocktail Shaker Sort](sorts/cocktail_shaker_sort.py)
* [Comb Sort](sorts/comb_sort.py)
* [Counting Sort](sorts/counting_sort.py)
* [Cycle Sort](sorts/cycle_sort.py)
* [Double Sort](sorts/double_sort.py)
* [Dutch National Flag Sort](sorts/dutch_national_flag_sort.py)
* [Exchange Sort](sorts/exchange_sort.py)
* [External Sort](sorts/external_sort.py)
* [Gnome Sort](sorts/gnome_sort.py)
* [Heap Sort](sorts/heap_sort.py)
* [Insertion Sort](sorts/insertion_sort.py)
* [Intro Sort](sorts/intro_sort.py)
* [Iterative Merge Sort](sorts/iterative_merge_sort.py)
* [Merge Insertion Sort](sorts/merge_insertion_sort.py)
* [Merge Sort](sorts/merge_sort.py)
* [Msd Radix Sort](sorts/msd_radix_sort.py)
* [Natural Sort](sorts/natural_sort.py)
* [Odd Even Sort](sorts/odd_even_sort.py)
* [Odd Even Transposition Parallel](sorts/odd_even_transposition_parallel.py)
* [Odd Even Transposition Single Threaded](sorts/odd_even_transposition_single_threaded.py)
* [Pancake Sort](sorts/pancake_sort.py)
* [Patience Sort](sorts/patience_sort.py)
* [Pigeon Sort](sorts/pigeon_sort.py)
* [Pigeonhole Sort](sorts/pigeonhole_sort.py)
* [Quick Sort](sorts/quick_sort.py)
* [Quick Sort 3 Partition](sorts/quick_sort_3_partition.py)
* [Radix Sort](sorts/radix_sort.py)
* [Random Normal Distribution Quicksort](sorts/random_normal_distribution_quicksort.py)
* [Random Pivot Quick Sort](sorts/random_pivot_quick_sort.py)
* [Recursive Bubble Sort](sorts/recursive_bubble_sort.py)
* [Recursive Insertion Sort](sorts/recursive_insertion_sort.py)
* [Recursive Mergesort Array](sorts/recursive_mergesort_array.py)
* [Recursive Quick Sort](sorts/recursive_quick_sort.py)
* [Selection Sort](sorts/selection_sort.py)
* [Shell Sort](sorts/shell_sort.py)
* [Slowsort](sorts/slowsort.py)
* [Stooge Sort](sorts/stooge_sort.py)
* [Strand Sort](sorts/strand_sort.py)
* [Tim Sort](sorts/tim_sort.py)
* [Topological Sort](sorts/topological_sort.py)
* [Tree Sort](sorts/tree_sort.py)
* [Unknown Sort](sorts/unknown_sort.py)
* [Wiggle Sort](sorts/wiggle_sort.py)
## Strings
* [Aho Corasick](strings/aho_corasick.py)
* [Alternative String Arrange](strings/alternative_string_arrange.py)
* [Anagrams](strings/anagrams.py)
* [Autocomplete Using Trie](strings/autocomplete_using_trie.py)
* [Boyer Moore Search](strings/boyer_moore_search.py)
* [Can String Be Rearranged As Palindrome](strings/can_string_be_rearranged_as_palindrome.py)
* [Capitalize](strings/capitalize.py)
* [Check Anagrams](strings/check_anagrams.py)
* [Check Pangram](strings/check_pangram.py)
* [Credit Card Validator](strings/credit_card_validator.py)
* [Detecting English Programmatically](strings/detecting_english_programmatically.py)
* [Frequency Finder](strings/frequency_finder.py)
* [Hamming Distance](strings/hamming_distance.py)
* [Indian Phone Validator](strings/indian_phone_validator.py)
* [Is Contains Unique Chars](strings/is_contains_unique_chars.py)
* [Is Palindrome](strings/is_palindrome.py)
* [Jaro Winkler](strings/jaro_winkler.py)
* [Join](strings/join.py)
* [Knuth Morris Pratt](strings/knuth_morris_pratt.py)
* [Levenshtein Distance](strings/levenshtein_distance.py)
* [Lower](strings/lower.py)
* [Manacher](strings/manacher.py)
* [Min Cost String Conversion](strings/min_cost_string_conversion.py)
* [Naive String Search](strings/naive_string_search.py)
* [Ngram](strings/ngram.py)
* [Palindrome](strings/palindrome.py)
* [Prefix Function](strings/prefix_function.py)
* [Rabin Karp](strings/rabin_karp.py)
* [Remove Duplicate](strings/remove_duplicate.py)
* [Reverse Letters](strings/reverse_letters.py)
* [Reverse Long Words](strings/reverse_long_words.py)
* [Reverse Words](strings/reverse_words.py)
* [Split](strings/split.py)
* [Upper](strings/upper.py)
* [Wave](strings/wave.py)
* [Wildcard Pattern Matching](strings/wildcard_pattern_matching.py)
* [Word Occurrence](strings/word_occurrence.py)
* [Word Patterns](strings/word_patterns.py)
* [Z Function](strings/z_function.py)
## Web Programming
* [Co2 Emission](web_programming/co2_emission.py)
* [Covid Stats Via Xpath](web_programming/covid_stats_via_xpath.py)
* [Crawl Google Results](web_programming/crawl_google_results.py)
* [Crawl Google Scholar Citation](web_programming/crawl_google_scholar_citation.py)
* [Currency Converter](web_programming/currency_converter.py)
* [Current Stock Price](web_programming/current_stock_price.py)
* [Current Weather](web_programming/current_weather.py)
* [Daily Horoscope](web_programming/daily_horoscope.py)
* [Download Images From Google Query](web_programming/download_images_from_google_query.py)
* [Emails From Url](web_programming/emails_from_url.py)
* [Fetch Anime And Play](web_programming/fetch_anime_and_play.py)
* [Fetch Bbc News](web_programming/fetch_bbc_news.py)
* [Fetch Github Info](web_programming/fetch_github_info.py)
* [Fetch Jobs](web_programming/fetch_jobs.py)
* [Fetch Well Rx Price](web_programming/fetch_well_rx_price.py)
* [Get Imdb Top 250 Movies Csv](web_programming/get_imdb_top_250_movies_csv.py)
* [Get Imdbtop](web_programming/get_imdbtop.py)
* [Get Top Hn Posts](web_programming/get_top_hn_posts.py)
* [Get User Tweets](web_programming/get_user_tweets.py)
* [Giphy](web_programming/giphy.py)
* [Instagram Crawler](web_programming/instagram_crawler.py)
* [Instagram Pic](web_programming/instagram_pic.py)
* [Instagram Video](web_programming/instagram_video.py)
* [Nasa Data](web_programming/nasa_data.py)
* [Random Anime Character](web_programming/random_anime_character.py)
* [Recaptcha Verification](web_programming/recaptcha_verification.py)
* [Reddit](web_programming/reddit.py)
* [Search Books By Isbn](web_programming/search_books_by_isbn.py)
* [Slack Message](web_programming/slack_message.py)
* [Test Fetch Github Info](web_programming/test_fetch_github_info.py)
* [World Covid19 Stats](web_programming/world_covid19_stats.py)
* [Matching Min Vertex Cover](graphs/matching_min_vertex_cover.py)
* [Minimum Path Sum](graphs/minimum_path_sum.py)
* [Minimum Spanning Tree Boruvka](graphs/minimum_spanning_tree_boruvka.py)
* [Minimum Spanning Tree Kruskal](graphs/minimum_spanning_tree_kruskal.py)
* [Minimum Spanning Tree Kruskal2](graphs/minimum_spanning_tree_kruskal2.py)
* [Minimum Spanning Tree Prims](graphs/minimum_spanning_tree_prims.py)
* [Minimum Spanning Tree Prims2](graphs/minimum_spanning_tree_prims2.py)
* [Multi Heuristic Astar](graphs/multi_heuristic_astar.py)
* [Page Rank](graphs/page_rank.py)
* [Prim](graphs/prim.py)
* [Random Graph Generator](graphs/random_graph_generator.py)
* [Scc Kosaraju](graphs/scc_kosaraju.py)
* [Strongly Connected Components](graphs/strongly_connected_components.py)
* [Tarjans Scc](graphs/tarjans_scc.py)
* Tests
* [Test Min Spanning Tree Kruskal](graphs/tests/test_min_spanning_tree_kruskal.py)
* [Test Min Spanning Tree Prim](graphs/tests/test_min_spanning_tree_prim.py)
## Greedy Methods
* [Optimal Merge Pattern](greedy_methods/optimal_merge_pattern.py)
## Hashes
* [Adler32](hashes/adler32.py)
* [Chaos Machine](hashes/chaos_machine.py)
* [Djb2](hashes/djb2.py)
* [Enigma Machine](hashes/enigma_machine.py)
* [Hamming Code](hashes/hamming_code.py)
* [Luhn](hashes/luhn.py)
* [Md5](hashes/md5.py)
* [Sdbm](hashes/sdbm.py)
* [Sha1](hashes/sha1.py)
* [Sha256](hashes/sha256.py)
## Knapsack
* [Greedy Knapsack](knapsack/greedy_knapsack.py)
* [Knapsack](knapsack/knapsack.py)
* Tests
* [Test Greedy Knapsack](knapsack/tests/test_greedy_knapsack.py)
* [Test Knapsack](knapsack/tests/test_knapsack.py)
## Linear Algebra
* Src
* [Conjugate Gradient](linear_algebra/src/conjugate_gradient.py)
* [Lib](linear_algebra/src/lib.py)
* [Polynom For Points](linear_algebra/src/polynom_for_points.py)
* [Power Iteration](linear_algebra/src/power_iteration.py)
* [Rayleigh Quotient](linear_algebra/src/rayleigh_quotient.py)
* [Schur Complement](linear_algebra/src/schur_complement.py)
* [Test Linear Algebra](linear_algebra/src/test_linear_algebra.py)
* [Transformations 2D](linear_algebra/src/transformations_2d.py)
## Machine Learning
* [Astar](machine_learning/astar.py)
* [Data Transformations](machine_learning/data_transformations.py)
* [Decision Tree](machine_learning/decision_tree.py)
* Forecasting
* [Run](machine_learning/forecasting/run.py)
* [Gaussian Naive Bayes](machine_learning/gaussian_naive_bayes.py)
* [Gradient Boosting Regressor](machine_learning/gradient_boosting_regressor.py)
* [Gradient Descent](machine_learning/gradient_descent.py)
* [K Means Clust](machine_learning/k_means_clust.py)
* [K Nearest Neighbours](machine_learning/k_nearest_neighbours.py)
* [Knn Sklearn](machine_learning/knn_sklearn.py)
* [Linear Discriminant Analysis](machine_learning/linear_discriminant_analysis.py)
* [Linear Regression](machine_learning/linear_regression.py)
* Local Weighted Learning
* [Local Weighted Learning](machine_learning/local_weighted_learning/local_weighted_learning.py)
* [Logistic Regression](machine_learning/logistic_regression.py)
* Lstm
* [Lstm Prediction](machine_learning/lstm/lstm_prediction.py)
* [Multilayer Perceptron Classifier](machine_learning/multilayer_perceptron_classifier.py)
* [Polymonial Regression](machine_learning/polymonial_regression.py)
* [Random Forest Classifier](machine_learning/random_forest_classifier.py)
* [Random Forest Regressor](machine_learning/random_forest_regressor.py)
* [Scoring Functions](machine_learning/scoring_functions.py)
* [Sequential Minimum Optimization](machine_learning/sequential_minimum_optimization.py)
* [Similarity Search](machine_learning/similarity_search.py)
* [Support Vector Machines](machine_learning/support_vector_machines.py)
* [Word Frequency Functions](machine_learning/word_frequency_functions.py)
## Maths
* [3N Plus 1](maths/3n_plus_1.py)
* [Abs](maths/abs.py)
* [Abs Max](maths/abs_max.py)
* [Abs Min](maths/abs_min.py)
* [Add](maths/add.py)
* [Aliquot Sum](maths/aliquot_sum.py)
* [Allocation Number](maths/allocation_number.py)
* [Area](maths/area.py)
* [Area Under Curve](maths/area_under_curve.py)
* [Armstrong Numbers](maths/armstrong_numbers.py)
* [Average Absolute Deviation](maths/average_absolute_deviation.py)
* [Average Mean](maths/average_mean.py)
* [Average Median](maths/average_median.py)
* [Average Mode](maths/average_mode.py)
* [Bailey Borwein Plouffe](maths/bailey_borwein_plouffe.py)
* [Basic Maths](maths/basic_maths.py)
* [Binary Exp Mod](maths/binary_exp_mod.py)
* [Binary Exponentiation](maths/binary_exponentiation.py)
* [Binary Exponentiation 2](maths/binary_exponentiation_2.py)
* [Binary Exponentiation 3](maths/binary_exponentiation_3.py)
* [Binomial Coefficient](maths/binomial_coefficient.py)
* [Binomial Distribution](maths/binomial_distribution.py)
* [Bisection](maths/bisection.py)
* [Ceil](maths/ceil.py)
* [Check Polygon](maths/check_polygon.py)
* [Chudnovsky Algorithm](maths/chudnovsky_algorithm.py)
* [Collatz Sequence](maths/collatz_sequence.py)
* [Combinations](maths/combinations.py)
* [Decimal Isolate](maths/decimal_isolate.py)
* [Double Factorial Iterative](maths/double_factorial_iterative.py)
* [Double Factorial Recursive](maths/double_factorial_recursive.py)
* [Entropy](maths/entropy.py)
* [Euclidean Distance](maths/euclidean_distance.py)
* [Euclidean Gcd](maths/euclidean_gcd.py)
* [Euler Method](maths/euler_method.py)
* [Euler Modified](maths/euler_modified.py)
* [Eulers Totient](maths/eulers_totient.py)
* [Extended Euclidean Algorithm](maths/extended_euclidean_algorithm.py)
* [Factorial Iterative](maths/factorial_iterative.py)
* [Factorial Recursive](maths/factorial_recursive.py)
* [Factors](maths/factors.py)
* [Fermat Little Theorem](maths/fermat_little_theorem.py)
* [Fibonacci](maths/fibonacci.py)
* [Find Max](maths/find_max.py)
* [Find Max Recursion](maths/find_max_recursion.py)
* [Find Min](maths/find_min.py)
* [Find Min Recursion](maths/find_min_recursion.py)
* [Floor](maths/floor.py)
* [Gamma](maths/gamma.py)
* [Gamma Recursive](maths/gamma_recursive.py)
* [Gaussian](maths/gaussian.py)
* [Greatest Common Divisor](maths/greatest_common_divisor.py)
* [Greedy Coin Change](maths/greedy_coin_change.py)
* [Hardy Ramanujanalgo](maths/hardy_ramanujanalgo.py)
* [Integration By Simpson Approx](maths/integration_by_simpson_approx.py)
* [Is Ip V4 Address Valid](maths/is_ip_v4_address_valid.py)
* [Is Square Free](maths/is_square_free.py)
* [Jaccard Similarity](maths/jaccard_similarity.py)
* [Kadanes](maths/kadanes.py)
* [Karatsuba](maths/karatsuba.py)
* [Krishnamurthy Number](maths/krishnamurthy_number.py)
* [Kth Lexicographic Permutation](maths/kth_lexicographic_permutation.py)
* [Largest Of Very Large Numbers](maths/largest_of_very_large_numbers.py)
* [Largest Subarray Sum](maths/largest_subarray_sum.py)
* [Least Common Multiple](maths/least_common_multiple.py)
* [Line Length](maths/line_length.py)
* [Lucas Lehmer Primality Test](maths/lucas_lehmer_primality_test.py)
* [Lucas Series](maths/lucas_series.py)
* [Matrix Exponentiation](maths/matrix_exponentiation.py)
* [Max Sum Sliding Window](maths/max_sum_sliding_window.py)
* [Median Of Two Arrays](maths/median_of_two_arrays.py)
* [Miller Rabin](maths/miller_rabin.py)
* [Mobius Function](maths/mobius_function.py)
* [Modular Exponential](maths/modular_exponential.py)
* [Monte Carlo](maths/monte_carlo.py)
* [Monte Carlo Dice](maths/monte_carlo_dice.py)
* [Nevilles Method](maths/nevilles_method.py)
* [Newton Raphson](maths/newton_raphson.py)
* [Number Of Digits](maths/number_of_digits.py)
* [Numerical Integration](maths/numerical_integration.py)
* [Perfect Cube](maths/perfect_cube.py)
* [Perfect Number](maths/perfect_number.py)
* [Perfect Square](maths/perfect_square.py)
* [Persistence](maths/persistence.py)
* [Pi Monte Carlo Estimation](maths/pi_monte_carlo_estimation.py)
* [Points Are Collinear 3D](maths/points_are_collinear_3d.py)
* [Pollard Rho](maths/pollard_rho.py)
* [Polynomial Evaluation](maths/polynomial_evaluation.py)
* [Power Using Recursion](maths/power_using_recursion.py)
* [Prime Check](maths/prime_check.py)
* [Prime Factors](maths/prime_factors.py)
* [Prime Numbers](maths/prime_numbers.py)
* [Prime Sieve Eratosthenes](maths/prime_sieve_eratosthenes.py)
* [Primelib](maths/primelib.py)
* [Proth Number](maths/proth_number.py)
* [Pythagoras](maths/pythagoras.py)
* [Qr Decomposition](maths/qr_decomposition.py)
* [Quadratic Equations Complex Numbers](maths/quadratic_equations_complex_numbers.py)
* [Radians](maths/radians.py)
* [Radix2 Fft](maths/radix2_fft.py)
* [Relu](maths/relu.py)
* [Runge Kutta](maths/runge_kutta.py)
* [Segmented Sieve](maths/segmented_sieve.py)
* Series
* [Arithmetic](maths/series/arithmetic.py)
* [Geometric](maths/series/geometric.py)
* [Geometric Series](maths/series/geometric_series.py)
* [Harmonic](maths/series/harmonic.py)
* [Harmonic Series](maths/series/harmonic_series.py)
* [Hexagonal Numbers](maths/series/hexagonal_numbers.py)
* [P Series](maths/series/p_series.py)
* [Sieve Of Eratosthenes](maths/sieve_of_eratosthenes.py)
* [Sigmoid](maths/sigmoid.py)
* [Simpson Rule](maths/simpson_rule.py)
* [Sin](maths/sin.py)
* [Sock Merchant](maths/sock_merchant.py)
* [Softmax](maths/softmax.py)
* [Square Root](maths/square_root.py)
* [Sum Of Arithmetic Series](maths/sum_of_arithmetic_series.py)
* [Sum Of Digits](maths/sum_of_digits.py)
* [Sum Of Geometric Progression](maths/sum_of_geometric_progression.py)
* [Sylvester Sequence](maths/sylvester_sequence.py)
* [Test Prime Check](maths/test_prime_check.py)
* [Trapezoidal Rule](maths/trapezoidal_rule.py)
* [Triplet Sum](maths/triplet_sum.py)
* [Two Pointer](maths/two_pointer.py)
* [Two Sum](maths/two_sum.py)
* [Ugly Numbers](maths/ugly_numbers.py)
* [Volume](maths/volume.py)
* [Zellers Congruence](maths/zellers_congruence.py)
## Matrix
* [Count Islands In Matrix](matrix/count_islands_in_matrix.py)
* [Inverse Of Matrix](matrix/inverse_of_matrix.py)
* [Matrix Class](matrix/matrix_class.py)
* [Matrix Operation](matrix/matrix_operation.py)
* [Nth Fibonacci Using Matrix Exponentiation](matrix/nth_fibonacci_using_matrix_exponentiation.py)
* [Rotate Matrix](matrix/rotate_matrix.py)
* [Searching In Sorted Matrix](matrix/searching_in_sorted_matrix.py)
* [Sherman Morrison](matrix/sherman_morrison.py)
* [Spiral Print](matrix/spiral_print.py)
* Tests
* [Test Matrix Operation](matrix/tests/test_matrix_operation.py)
## Networking Flow
* [Ford Fulkerson](networking_flow/ford_fulkerson.py)
* [Minimum Cut](networking_flow/minimum_cut.py)
## Neural Network
* [2 Hidden Layers Neural Network](neural_network/2_hidden_layers_neural_network.py)
* [Back Propagation Neural Network](neural_network/back_propagation_neural_network.py)
* [Convolution Neural Network](neural_network/convolution_neural_network.py)
* [Perceptron](neural_network/perceptron.py)
## Other
* [Activity Selection](other/activity_selection.py)
* [Alternative List Arrange](other/alternative_list_arrange.py)
* [Check Strong Password](other/check_strong_password.py)
* [Davisb Putnamb Logemannb Loveland](other/davisb_putnamb_logemannb_loveland.py)
* [Dijkstra Bankers Algorithm](other/dijkstra_bankers_algorithm.py)
* [Doomsday](other/doomsday.py)
* [Fischer Yates Shuffle](other/fischer_yates_shuffle.py)
* [Gauss Easter](other/gauss_easter.py)
* [Graham Scan](other/graham_scan.py)
* [Greedy](other/greedy.py)
* [Least Recently Used](other/least_recently_used.py)
* [Lfu Cache](other/lfu_cache.py)
* [Linear Congruential Generator](other/linear_congruential_generator.py)
* [Lru Cache](other/lru_cache.py)
* [Magicdiamondpattern](other/magicdiamondpattern.py)
* [Nested Brackets](other/nested_brackets.py)
* [Password Generator](other/password_generator.py)
* [Scoring Algorithm](other/scoring_algorithm.py)
* [Sdes](other/sdes.py)
* [Tower Of Hanoi](other/tower_of_hanoi.py)
## Physics
* [Horizontal Projectile Motion](physics/horizontal_projectile_motion.py)
* [Lorenz Transformation Four Vector](physics/lorenz_transformation_four_vector.py)
* [N Body Simulation](physics/n_body_simulation.py)
* [Newtons Second Law Of Motion](physics/newtons_second_law_of_motion.py)
## Project Euler
* Problem 001
* [Sol1](project_euler/problem_001/sol1.py)
* [Sol2](project_euler/problem_001/sol2.py)
* [Sol3](project_euler/problem_001/sol3.py)
* [Sol4](project_euler/problem_001/sol4.py)
* [Sol5](project_euler/problem_001/sol5.py)
* [Sol6](project_euler/problem_001/sol6.py)
* [Sol7](project_euler/problem_001/sol7.py)
* Problem 002
* [Sol1](project_euler/problem_002/sol1.py)
* [Sol2](project_euler/problem_002/sol2.py)
* [Sol3](project_euler/problem_002/sol3.py)
* [Sol4](project_euler/problem_002/sol4.py)
* [Sol5](project_euler/problem_002/sol5.py)
* Problem 003
* [Sol1](project_euler/problem_003/sol1.py)
* [Sol2](project_euler/problem_003/sol2.py)
* [Sol3](project_euler/problem_003/sol3.py)
* Problem 004
* [Sol1](project_euler/problem_004/sol1.py)
* [Sol2](project_euler/problem_004/sol2.py)
* Problem 005
* [Sol1](project_euler/problem_005/sol1.py)
* [Sol2](project_euler/problem_005/sol2.py)
* Problem 006
* [Sol1](project_euler/problem_006/sol1.py)
* [Sol2](project_euler/problem_006/sol2.py)
* [Sol3](project_euler/problem_006/sol3.py)
* [Sol4](project_euler/problem_006/sol4.py)
* Problem 007
* [Sol1](project_euler/problem_007/sol1.py)
* [Sol2](project_euler/problem_007/sol2.py)
* [Sol3](project_euler/problem_007/sol3.py)
* Problem 008
* [Sol1](project_euler/problem_008/sol1.py)
* [Sol2](project_euler/problem_008/sol2.py)
* [Sol3](project_euler/problem_008/sol3.py)
* Problem 009
* [Sol1](project_euler/problem_009/sol1.py)
* [Sol2](project_euler/problem_009/sol2.py)
* [Sol3](project_euler/problem_009/sol3.py)
* Problem 010
* [Sol1](project_euler/problem_010/sol1.py)
* [Sol2](project_euler/problem_010/sol2.py)
* [Sol3](project_euler/problem_010/sol3.py)
* Problem 011
* [Sol1](project_euler/problem_011/sol1.py)
* [Sol2](project_euler/problem_011/sol2.py)
* Problem 012
* [Sol1](project_euler/problem_012/sol1.py)
* [Sol2](project_euler/problem_012/sol2.py)
* Problem 013
* [Sol1](project_euler/problem_013/sol1.py)
* Problem 014
* [Sol1](project_euler/problem_014/sol1.py)
* [Sol2](project_euler/problem_014/sol2.py)
* Problem 015
* [Sol1](project_euler/problem_015/sol1.py)
* Problem 016
* [Sol1](project_euler/problem_016/sol1.py)
* [Sol2](project_euler/problem_016/sol2.py)
* Problem 017
* [Sol1](project_euler/problem_017/sol1.py)
* Problem 018
* [Solution](project_euler/problem_018/solution.py)
* Problem 019
* [Sol1](project_euler/problem_019/sol1.py)
* Problem 020
* [Sol1](project_euler/problem_020/sol1.py)
* [Sol2](project_euler/problem_020/sol2.py)
* [Sol3](project_euler/problem_020/sol3.py)
* [Sol4](project_euler/problem_020/sol4.py)
* Problem 021
* [Sol1](project_euler/problem_021/sol1.py)
* Problem 022
* [Sol1](project_euler/problem_022/sol1.py)
* [Sol2](project_euler/problem_022/sol2.py)
* Problem 023
* [Sol1](project_euler/problem_023/sol1.py)
* Problem 024
* [Sol1](project_euler/problem_024/sol1.py)
* Problem 025
* [Sol1](project_euler/problem_025/sol1.py)
* [Sol2](project_euler/problem_025/sol2.py)
* [Sol3](project_euler/problem_025/sol3.py)
* Problem 026
* [Sol1](project_euler/problem_026/sol1.py)
* Problem 027
* [Sol1](project_euler/problem_027/sol1.py)
* Problem 028
* [Sol1](project_euler/problem_028/sol1.py)
* Problem 029
* [Sol1](project_euler/problem_029/sol1.py)
* Problem 030
* [Sol1](project_euler/problem_030/sol1.py)
* Problem 031
* [Sol1](project_euler/problem_031/sol1.py)
* [Sol2](project_euler/problem_031/sol2.py)
* Problem 032
* [Sol32](project_euler/problem_032/sol32.py)
* Problem 033
* [Sol1](project_euler/problem_033/sol1.py)
* Problem 034
* [Sol1](project_euler/problem_034/sol1.py)
* Problem 035
* [Sol1](project_euler/problem_035/sol1.py)
* Problem 036
* [Sol1](project_euler/problem_036/sol1.py)
* Problem 037
* [Sol1](project_euler/problem_037/sol1.py)
* Problem 038
* [Sol1](project_euler/problem_038/sol1.py)
* Problem 039
* [Sol1](project_euler/problem_039/sol1.py)
* Problem 040
* [Sol1](project_euler/problem_040/sol1.py)
* Problem 041
* [Sol1](project_euler/problem_041/sol1.py)
* Problem 042
* [Solution42](project_euler/problem_042/solution42.py)
* Problem 043
* [Sol1](project_euler/problem_043/sol1.py)
* Problem 044
* [Sol1](project_euler/problem_044/sol1.py)
* Problem 045
* [Sol1](project_euler/problem_045/sol1.py)
* Problem 046
* [Sol1](project_euler/problem_046/sol1.py)
* Problem 047
* [Sol1](project_euler/problem_047/sol1.py)
* Problem 048
* [Sol1](project_euler/problem_048/sol1.py)
* Problem 049
* [Sol1](project_euler/problem_049/sol1.py)
* Problem 050
* [Sol1](project_euler/problem_050/sol1.py)
* Problem 051
* [Sol1](project_euler/problem_051/sol1.py)
* Problem 052
* [Sol1](project_euler/problem_052/sol1.py)
* Problem 053
* [Sol1](project_euler/problem_053/sol1.py)
* Problem 054
* [Sol1](project_euler/problem_054/sol1.py)
* [Test Poker Hand](project_euler/problem_054/test_poker_hand.py)
* Problem 055
* [Sol1](project_euler/problem_055/sol1.py)
* Problem 056
* [Sol1](project_euler/problem_056/sol1.py)
* Problem 057
* [Sol1](project_euler/problem_057/sol1.py)
* Problem 058
* [Sol1](project_euler/problem_058/sol1.py)
* Problem 059
* [Sol1](project_euler/problem_059/sol1.py)
* Problem 062
* [Sol1](project_euler/problem_062/sol1.py)
* Problem 063
* [Sol1](project_euler/problem_063/sol1.py)
* Problem 064
* [Sol1](project_euler/problem_064/sol1.py)
* Problem 065
* [Sol1](project_euler/problem_065/sol1.py)
* Problem 067
* [Sol1](project_euler/problem_067/sol1.py)
* [Sol2](project_euler/problem_067/sol2.py)
* Problem 068
* [Sol1](project_euler/problem_068/sol1.py)
* Problem 069
* [Sol1](project_euler/problem_069/sol1.py)
* Problem 070
* [Sol1](project_euler/problem_070/sol1.py)
* Problem 071
* [Sol1](project_euler/problem_071/sol1.py)
* Problem 072
* [Sol1](project_euler/problem_072/sol1.py)
* [Sol2](project_euler/problem_072/sol2.py)
* Problem 074
* [Sol1](project_euler/problem_074/sol1.py)
* [Sol2](project_euler/problem_074/sol2.py)
* Problem 075
* [Sol1](project_euler/problem_075/sol1.py)
* Problem 076
* [Sol1](project_euler/problem_076/sol1.py)
* Problem 077
* [Sol1](project_euler/problem_077/sol1.py)
* Problem 078
* [Sol1](project_euler/problem_078/sol1.py)
* Problem 080
* [Sol1](project_euler/problem_080/sol1.py)
* Problem 081
* [Sol1](project_euler/problem_081/sol1.py)
* Problem 085
* [Sol1](project_euler/problem_085/sol1.py)
* Problem 086
* [Sol1](project_euler/problem_086/sol1.py)
* Problem 087
* [Sol1](project_euler/problem_087/sol1.py)
* Problem 089
* [Sol1](project_euler/problem_089/sol1.py)
* Problem 091
* [Sol1](project_euler/problem_091/sol1.py)
* Problem 092
* [Sol1](project_euler/problem_092/sol1.py)
* Problem 097
* [Sol1](project_euler/problem_097/sol1.py)
* Problem 099
* [Sol1](project_euler/problem_099/sol1.py)
* Problem 101
* [Sol1](project_euler/problem_101/sol1.py)
* Problem 102
* [Sol1](project_euler/problem_102/sol1.py)
* Problem 104
* [Sol](project_euler/problem_104/sol.py)
* Problem 107
* [Sol1](project_euler/problem_107/sol1.py)
* Problem 109
* [Sol1](project_euler/problem_109/sol1.py)
* Problem 112
* [Sol1](project_euler/problem_112/sol1.py)
* Problem 113
* [Sol1](project_euler/problem_113/sol1.py)
* Problem 119
* [Sol1](project_euler/problem_119/sol1.py)
* Problem 120
* [Sol1](project_euler/problem_120/sol1.py)
* Problem 121
* [Sol1](project_euler/problem_121/sol1.py)
* Problem 123
* [Sol1](project_euler/problem_123/sol1.py)
* Problem 125
* [Sol1](project_euler/problem_125/sol1.py)
* Problem 129
* [Sol1](project_euler/problem_129/sol1.py)
* Problem 135
* [Sol1](project_euler/problem_135/sol1.py)
* Problem 144
* [Sol1](project_euler/problem_144/sol1.py)
* Problem 145
* [Sol1](project_euler/problem_145/sol1.py)
* Problem 173
* [Sol1](project_euler/problem_173/sol1.py)
* Problem 174
* [Sol1](project_euler/problem_174/sol1.py)
* Problem 180
* [Sol1](project_euler/problem_180/sol1.py)
* Problem 188
* [Sol1](project_euler/problem_188/sol1.py)
* Problem 191
* [Sol1](project_euler/problem_191/sol1.py)
* Problem 203
* [Sol1](project_euler/problem_203/sol1.py)
* Problem 205
* [Sol1](project_euler/problem_205/sol1.py)
* Problem 206
* [Sol1](project_euler/problem_206/sol1.py)
* Problem 207
* [Sol1](project_euler/problem_207/sol1.py)
* Problem 234
* [Sol1](project_euler/problem_234/sol1.py)
* Problem 301
* [Sol1](project_euler/problem_301/sol1.py)
* Problem 493
* [Sol1](project_euler/problem_493/sol1.py)
* Problem 551
* [Sol1](project_euler/problem_551/sol1.py)
* Problem 686
* [Sol1](project_euler/problem_686/sol1.py)
## Quantum
* [Deutsch Jozsa](quantum/deutsch_jozsa.py)
* [Half Adder](quantum/half_adder.py)
* [Not Gate](quantum/not_gate.py)
* [Quantum Entanglement](quantum/quantum_entanglement.py)
* [Ripple Adder Classic](quantum/ripple_adder_classic.py)
* [Single Qubit Measure](quantum/single_qubit_measure.py)
## Scheduling
* [First Come First Served](scheduling/first_come_first_served.py)
* [Highest Response Ratio Next](scheduling/highest_response_ratio_next.py)
* [Multi Level Feedback Queue](scheduling/multi_level_feedback_queue.py)
* [Non Preemptive Shortest Job First](scheduling/non_preemptive_shortest_job_first.py)
* [Round Robin](scheduling/round_robin.py)
* [Shortest Job First](scheduling/shortest_job_first.py)
## Searches
* [Binary Search](searches/binary_search.py)
* [Binary Tree Traversal](searches/binary_tree_traversal.py)
* [Double Linear Search](searches/double_linear_search.py)
* [Double Linear Search Recursion](searches/double_linear_search_recursion.py)
* [Fibonacci Search](searches/fibonacci_search.py)
* [Hill Climbing](searches/hill_climbing.py)
* [Interpolation Search](searches/interpolation_search.py)
* [Jump Search](searches/jump_search.py)
* [Linear Search](searches/linear_search.py)
* [Quick Select](searches/quick_select.py)
* [Sentinel Linear Search](searches/sentinel_linear_search.py)
* [Simple Binary Search](searches/simple_binary_search.py)
* [Simulated Annealing](searches/simulated_annealing.py)
* [Tabu Search](searches/tabu_search.py)
* [Ternary Search](searches/ternary_search.py)
## Sorts
* [Bead Sort](sorts/bead_sort.py)
* [Bitonic Sort](sorts/bitonic_sort.py)
* [Bogo Sort](sorts/bogo_sort.py)
* [Bubble Sort](sorts/bubble_sort.py)
* [Bucket Sort](sorts/bucket_sort.py)
* [Cocktail Shaker Sort](sorts/cocktail_shaker_sort.py)
* [Comb Sort](sorts/comb_sort.py)
* [Counting Sort](sorts/counting_sort.py)
* [Cycle Sort](sorts/cycle_sort.py)
* [Double Sort](sorts/double_sort.py)
* [Dutch National Flag Sort](sorts/dutch_national_flag_sort.py)
* [Exchange Sort](sorts/exchange_sort.py)
* [External Sort](sorts/external_sort.py)
* [Gnome Sort](sorts/gnome_sort.py)
* [Heap Sort](sorts/heap_sort.py)
* [Insertion Sort](sorts/insertion_sort.py)
* [Intro Sort](sorts/intro_sort.py)
* [Iterative Merge Sort](sorts/iterative_merge_sort.py)
* [Merge Insertion Sort](sorts/merge_insertion_sort.py)
* [Merge Sort](sorts/merge_sort.py)
* [Msd Radix Sort](sorts/msd_radix_sort.py)
* [Natural Sort](sorts/natural_sort.py)
* [Odd Even Sort](sorts/odd_even_sort.py)
* [Odd Even Transposition Parallel](sorts/odd_even_transposition_parallel.py)
* [Odd Even Transposition Single Threaded](sorts/odd_even_transposition_single_threaded.py)
* [Pancake Sort](sorts/pancake_sort.py)
* [Patience Sort](sorts/patience_sort.py)
* [Pigeon Sort](sorts/pigeon_sort.py)
* [Pigeonhole Sort](sorts/pigeonhole_sort.py)
* [Quick Sort](sorts/quick_sort.py)
* [Quick Sort 3 Partition](sorts/quick_sort_3_partition.py)
* [Radix Sort](sorts/radix_sort.py)
* [Random Normal Distribution Quicksort](sorts/random_normal_distribution_quicksort.py)
* [Random Pivot Quick Sort](sorts/random_pivot_quick_sort.py)
* [Recursive Bubble Sort](sorts/recursive_bubble_sort.py)
* [Recursive Insertion Sort](sorts/recursive_insertion_sort.py)
* [Recursive Mergesort Array](sorts/recursive_mergesort_array.py)
* [Recursive Quick Sort](sorts/recursive_quick_sort.py)
* [Selection Sort](sorts/selection_sort.py)
* [Shell Sort](sorts/shell_sort.py)
* [Slowsort](sorts/slowsort.py)
* [Stooge Sort](sorts/stooge_sort.py)
* [Strand Sort](sorts/strand_sort.py)
* [Tim Sort](sorts/tim_sort.py)
* [Topological Sort](sorts/topological_sort.py)
* [Tree Sort](sorts/tree_sort.py)
* [Unknown Sort](sorts/unknown_sort.py)
* [Wiggle Sort](sorts/wiggle_sort.py)
## Strings
* [Aho Corasick](strings/aho_corasick.py)
* [Alternative String Arrange](strings/alternative_string_arrange.py)
* [Anagrams](strings/anagrams.py)
* [Autocomplete Using Trie](strings/autocomplete_using_trie.py)
* [Boyer Moore Search](strings/boyer_moore_search.py)
* [Can String Be Rearranged As Palindrome](strings/can_string_be_rearranged_as_palindrome.py)
* [Capitalize](strings/capitalize.py)
* [Check Anagrams](strings/check_anagrams.py)
* [Check Pangram](strings/check_pangram.py)
* [Credit Card Validator](strings/credit_card_validator.py)
* [Detecting English Programmatically](strings/detecting_english_programmatically.py)
* [Frequency Finder](strings/frequency_finder.py)
* [Hamming Distance](strings/hamming_distance.py)
* [Indian Phone Validator](strings/indian_phone_validator.py)
* [Is Contains Unique Chars](strings/is_contains_unique_chars.py)
* [Is Palindrome](strings/is_palindrome.py)
* [Jaro Winkler](strings/jaro_winkler.py)
* [Join](strings/join.py)
* [Knuth Morris Pratt](strings/knuth_morris_pratt.py)
* [Levenshtein Distance](strings/levenshtein_distance.py)
* [Lower](strings/lower.py)
* [Manacher](strings/manacher.py)
* [Min Cost String Conversion](strings/min_cost_string_conversion.py)
* [Naive String Search](strings/naive_string_search.py)
* [Ngram](strings/ngram.py)
* [Palindrome](strings/palindrome.py)
* [Prefix Function](strings/prefix_function.py)
* [Rabin Karp](strings/rabin_karp.py)
* [Remove Duplicate](strings/remove_duplicate.py)
* [Reverse Letters](strings/reverse_letters.py)
* [Reverse Long Words](strings/reverse_long_words.py)
* [Reverse Words](strings/reverse_words.py)
* [Split](strings/split.py)
* [Upper](strings/upper.py)
* [Wave](strings/wave.py)
* [Wildcard Pattern Matching](strings/wildcard_pattern_matching.py)
* [Word Occurrence](strings/word_occurrence.py)
* [Word Patterns](strings/word_patterns.py)
* [Z Function](strings/z_function.py)
## Web Programming
* [Co2 Emission](web_programming/co2_emission.py)
* [Covid Stats Via Xpath](web_programming/covid_stats_via_xpath.py)
* [Crawl Google Results](web_programming/crawl_google_results.py)
* [Crawl Google Scholar Citation](web_programming/crawl_google_scholar_citation.py)
* [Currency Converter](web_programming/currency_converter.py)
* [Current Stock Price](web_programming/current_stock_price.py)
* [Current Weather](web_programming/current_weather.py)
* [Daily Horoscope](web_programming/daily_horoscope.py)
* [Download Images From Google Query](web_programming/download_images_from_google_query.py)
* [Emails From Url](web_programming/emails_from_url.py)
* [Fetch Anime And Play](web_programming/fetch_anime_and_play.py)
* [Fetch Bbc News](web_programming/fetch_bbc_news.py)
* [Fetch Github Info](web_programming/fetch_github_info.py)
* [Fetch Jobs](web_programming/fetch_jobs.py)
* [Fetch Well Rx Price](web_programming/fetch_well_rx_price.py)
* [Get Imdb Top 250 Movies Csv](web_programming/get_imdb_top_250_movies_csv.py)
* [Get Imdbtop](web_programming/get_imdbtop.py)
* [Get Top Hn Posts](web_programming/get_top_hn_posts.py)
* [Get User Tweets](web_programming/get_user_tweets.py)
* [Giphy](web_programming/giphy.py)
* [Instagram Crawler](web_programming/instagram_crawler.py)
* [Instagram Pic](web_programming/instagram_pic.py)
* [Instagram Video](web_programming/instagram_video.py)
* [Nasa Data](web_programming/nasa_data.py)
* [Random Anime Character](web_programming/random_anime_character.py)
* [Recaptcha Verification](web_programming/recaptcha_verification.py)
* [Reddit](web_programming/reddit.py)
* [Search Books By Isbn](web_programming/search_books_by_isbn.py)
* [Slack Message](web_programming/slack_message.py)
* [Test Fetch Github Info](web_programming/test_fetch_github_info.py)
* [World Covid19 Stats](web_programming/world_covid19_stats.py)
| 1 |
TheAlgorithms/Python | 6,246 | Get rid of the Union | ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| cclauss | "2022-07-11T10:58:09Z" | "2022-07-11T11:11:17Z" | ba129de7f32b6acd1efd8e942aca109bacd86646 | dad789d9034ea6fb183bddb1a34b6b89d379e422 | Get rid of the Union. ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
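For context, the change in this row swaps the `typing.Union` spelling for the PEP 604 `X | Y` union syntax (the alias is evaluated at runtime, so this assumes Python 3.10+). A minimal before/after sketch of the type alias touched in the files below:

# Before: the alias needs an import from typing
from typing import Union
Matrix = list[list[Union[float, int]]]
# After: built-in union syntax, no typing import required
Matrix = list[list[float | int]]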
| """
If we are presented with the first k terms of a sequence it is impossible to say with
certainty the value of the next term, as there are infinitely many polynomial functions
that can model the sequence.
As an example, let us consider the sequence of cube
numbers. This is defined by the generating function,
u(n) = n^3: 1, 8, 27, 64, 125, 216, ...
Suppose we were only given the first two terms of this sequence. Working on the
principle that "simple is best" we should assume a linear relationship and predict the
next term to be 15 (common difference 7). Even if we were presented with the first three
terms, by the same principle of simplicity, a quadratic relationship should be
assumed.
We shall define OP(k, n) to be the nth term of the optimum polynomial
generating function for the first k terms of a sequence. It should be clear that
OP(k, n) will accurately generate the terms of the sequence for n ≤ k, and potentially
the first incorrect term (FIT) will be OP(k, k+1); in which case we shall call it a
bad OP (BOP).
As a basis, if we were only given the first term of a sequence, it would be most
sensible to assume constancy; that is, for n ≥ 2, OP(1, n) = u(1).
Hence we obtain the
following OPs for the cubic sequence:
OP(1, n) = 1: 1, 1, 1, 1, ...
OP(2, n) = 7n-6: 1, 8, 15, ...
OP(3, n) = 6n^2-11n+6: 1, 8, 27, 58, ...
OP(4, n) = n^3: 1, 8, 27, 64, 125, ...
Clearly no BOPs exist for k ≥ 4.
By considering the sum of FITs generated by the BOPs (indicated in red above), we
obtain 1 + 15 + 58 = 74.
Consider the following tenth degree polynomial generating function:
1 - n + n^2 - n^3 + n^4 - n^5 + n^6 - n^7 + n^8 - n^9 + n^10
Find the sum of FITs for the BOPs.
"""
from __future__ import annotations
from collections.abc import Callable
from typing import Union
Matrix = list[list[Union[float, int]]]
def solve(matrix: Matrix, vector: Matrix) -> Matrix:
"""
Solve the linear system of equations Ax = b (A = "matrix", b = "vector")
for x using Gaussian elimination and back substitution. We assume that A
is an invertible square matrix and that b is a column vector of the
same height.
>>> solve([[1, 0], [0, 1]], [[1],[2]])
[[1.0], [2.0]]
>>> solve([[2, 1, -1],[-3, -1, 2],[-2, 1, 2]],[[8], [-11],[-3]])
[[2.0], [3.0], [-1.0]]
"""
size: int = len(matrix)
augmented: Matrix = [[0 for _ in range(size + 1)] for _ in range(size)]
row: int
row2: int
col: int
col2: int
pivot_row: int
ratio: float
for row in range(size):
for col in range(size):
augmented[row][col] = matrix[row][col]
augmented[row][size] = vector[row][0]
row = 0
col = 0
while row < size and col < size:
# pivoting
pivot_row = max((abs(augmented[row2][col]), row2) for row2 in range(col, size))[
1
]
if augmented[pivot_row][col] == 0:
col += 1
continue
else:
augmented[row], augmented[pivot_row] = augmented[pivot_row], augmented[row]
for row2 in range(row + 1, size):
ratio = augmented[row2][col] / augmented[row][col]
augmented[row2][col] = 0
for col2 in range(col + 1, size + 1):
augmented[row2][col2] -= augmented[row][col2] * ratio
row += 1
col += 1
# back substitution
for col in range(1, size):
for row in range(col):
ratio = augmented[row][col] / augmented[col][col]
for col2 in range(col, size + 1):
augmented[row][col2] -= augmented[col][col2] * ratio
# round to get rid of numbers like 2.000000000000004
return [
[round(augmented[row][size] / augmented[row][row], 10)] for row in range(size)
]
def interpolate(y_list: list[int]) -> Callable[[int], int]:
"""
Given a list of data points (1,y0),(2,y1), ..., return a function that
interpolates the data points. We find the coefficients of the interpolating
polynomial by solving a system of linear equations corresponding to
x = 1, 2, 3...
>>> interpolate([1])(3)
1
>>> interpolate([1, 8])(3)
15
>>> interpolate([1, 8, 27])(4)
58
>>> interpolate([1, 8, 27, 64])(6)
216
"""
size: int = len(y_list)
matrix: Matrix = [[0 for _ in range(size)] for _ in range(size)]
vector: Matrix = [[0] for _ in range(size)]
coeffs: Matrix
x_val: int
y_val: int
col: int
for x_val, y_val in enumerate(y_list):
for col in range(size):
matrix[x_val][col] = (x_val + 1) ** (size - col - 1)
vector[x_val][0] = y_val
coeffs = solve(matrix, vector)
def interpolated_func(var: int) -> int:
"""
>>> interpolate([1])(3)
1
>>> interpolate([1, 8])(3)
15
>>> interpolate([1, 8, 27])(4)
58
>>> interpolate([1, 8, 27, 64])(6)
216
"""
return sum(
round(coeffs[x_val][0]) * (var ** (size - x_val - 1))
for x_val in range(size)
)
return interpolated_func
def question_function(variable: int) -> int:
"""
The generating function u as specified in the question.
>>> question_function(0)
1
>>> question_function(1)
1
>>> question_function(5)
8138021
>>> question_function(10)
9090909091
"""
return (
1
- variable
+ variable**2
- variable**3
+ variable**4
- variable**5
+ variable**6
- variable**7
+ variable**8
- variable**9
+ variable**10
)
def solution(func: Callable[[int], int] = question_function, order: int = 10) -> int:
"""
Find the sum of the FITs of the BOPS. For each interpolating polynomial of order
1, 2, ... , 10, find the first x such that the value of the polynomial at x does
not equal u(x).
>>> solution(lambda n: n ** 3, 3)
74
"""
data_points: list[int] = [func(x_val) for x_val in range(1, order + 1)]
polynomials: list[Callable[[int], int]] = [
interpolate(data_points[:max_coeff]) for max_coeff in range(1, order + 1)
]
ret: int = 0
poly: Callable[[int], int]
x_val: int
for poly in polynomials:
x_val = 1
while func(x_val) == poly(x_val):
x_val += 1
ret += poly(x_val)
return ret
if __name__ == "__main__":
print(f"{solution() = }")
| """
If we are presented with the first k terms of a sequence it is impossible to say with
certainty the value of the next term, as there are infinitely many polynomial functions
that can model the sequence.
As an example, let us consider the sequence of cube
numbers. This is defined by the generating function,
u(n) = n^3: 1, 8, 27, 64, 125, 216, ...
Suppose we were only given the first two terms of this sequence. Working on the
principle that "simple is best" we should assume a linear relationship and predict the
next term to be 15 (common difference 7). Even if we were presented with the first three
terms, by the same principle of simplicity, a quadratic relationship should be
assumed.
We shall define OP(k, n) to be the nth term of the optimum polynomial
generating function for the first k terms of a sequence. It should be clear that
OP(k, n) will accurately generate the terms of the sequence for n ≤ k, and potentially
the first incorrect term (FIT) will be OP(k, k+1); in which case we shall call it a
bad OP (BOP).
As a basis, if we were only given the first term of a sequence, it would be most
sensible to assume constancy; that is, for n ≥ 2, OP(1, n) = u(1).
Hence we obtain the
following OPs for the cubic sequence:
OP(1, n) = 1: 1, 1, 1, 1, ...
OP(2, n) = 7n-6: 1, 8, 15, ...
OP(3, n) = 6n^2-11n+6: 1, 8, 27, 58, ...
OP(4, n) = n^3: 1, 8, 27, 64, 125, ...
Clearly no BOPs exist for k ≥ 4.
By considering the sum of FITs generated by the BOPs (indicated in red above), we
obtain 1 + 15 + 58 = 74.
Consider the following tenth degree polynomial generating function:
1 - n + n^2 - n^3 + n^4 - n^5 + n^6 - n^7 + n^8 - n^9 + n^10
Find the sum of FITs for the BOPs.
"""
from __future__ import annotations
from collections.abc import Callable
Matrix = list[list[float | int]]
def solve(matrix: Matrix, vector: Matrix) -> Matrix:
"""
Solve the linear system of equations Ax = b (A = "matrix", b = "vector")
for x using Gaussian elimination and back substitution. We assume that A
is an invertible square matrix and that b is a column vector of the
same height.
>>> solve([[1, 0], [0, 1]], [[1],[2]])
[[1.0], [2.0]]
>>> solve([[2, 1, -1],[-3, -1, 2],[-2, 1, 2]],[[8], [-11],[-3]])
[[2.0], [3.0], [-1.0]]
"""
size: int = len(matrix)
augmented: Matrix = [[0 for _ in range(size + 1)] for _ in range(size)]
row: int
row2: int
col: int
col2: int
pivot_row: int
ratio: float
for row in range(size):
for col in range(size):
augmented[row][col] = matrix[row][col]
augmented[row][size] = vector[row][0]
row = 0
col = 0
while row < size and col < size:
# pivoting
pivot_row = max((abs(augmented[row2][col]), row2) for row2 in range(col, size))[
1
]
if augmented[pivot_row][col] == 0:
col += 1
continue
else:
augmented[row], augmented[pivot_row] = augmented[pivot_row], augmented[row]
for row2 in range(row + 1, size):
ratio = augmented[row2][col] / augmented[row][col]
augmented[row2][col] = 0
for col2 in range(col + 1, size + 1):
augmented[row2][col2] -= augmented[row][col2] * ratio
row += 1
col += 1
# back substitution
for col in range(1, size):
for row in range(col):
ratio = augmented[row][col] / augmented[col][col]
for col2 in range(col, size + 1):
augmented[row][col2] -= augmented[col][col2] * ratio
# round to get rid of numbers like 2.000000000000004
return [
[round(augmented[row][size] / augmented[row][row], 10)] for row in range(size)
]
def interpolate(y_list: list[int]) -> Callable[[int], int]:
"""
Given a list of data points (1,y0),(2,y1), ..., return a function that
interpolates the data points. We find the coefficients of the interpolating
polynomial by solving a system of linear equations corresponding to
x = 1, 2, 3...
>>> interpolate([1])(3)
1
>>> interpolate([1, 8])(3)
15
>>> interpolate([1, 8, 27])(4)
58
>>> interpolate([1, 8, 27, 64])(6)
216
"""
size: int = len(y_list)
matrix: Matrix = [[0 for _ in range(size)] for _ in range(size)]
vector: Matrix = [[0] for _ in range(size)]
coeffs: Matrix
x_val: int
y_val: int
col: int
for x_val, y_val in enumerate(y_list):
for col in range(size):
matrix[x_val][col] = (x_val + 1) ** (size - col - 1)
vector[x_val][0] = y_val
coeffs = solve(matrix, vector)
def interpolated_func(var: int) -> int:
"""
>>> interpolate([1])(3)
1
>>> interpolate([1, 8])(3)
15
>>> interpolate([1, 8, 27])(4)
58
>>> interpolate([1, 8, 27, 64])(6)
216
"""
return sum(
round(coeffs[x_val][0]) * (var ** (size - x_val - 1))
for x_val in range(size)
)
return interpolated_func
def question_function(variable: int) -> int:
"""
The generating function u as specified in the question.
>>> question_function(0)
1
>>> question_function(1)
1
>>> question_function(5)
8138021
>>> question_function(10)
9090909091
"""
return (
1
- variable
+ variable**2
- variable**3
+ variable**4
- variable**5
+ variable**6
- variable**7
+ variable**8
- variable**9
+ variable**10
)
def solution(func: Callable[[int], int] = question_function, order: int = 10) -> int:
"""
Find the sum of the FITs of the BOPS. For each interpolating polynomial of order
1, 2, ... , 10, find the first x such that the value of the polynomial at x does
not equal u(x).
>>> solution(lambda n: n ** 3, 3)
74
"""
data_points: list[int] = [func(x_val) for x_val in range(1, order + 1)]
polynomials: list[Callable[[int], int]] = [
interpolate(data_points[:max_coeff]) for max_coeff in range(1, order + 1)
]
ret: int = 0
poly: Callable[[int], int]
x_val: int
for poly in polynomials:
x_val = 1
while func(x_val) == poly(x_val):
x_val += 1
ret += poly(x_val)
return ret
if __name__ == "__main__":
print(f"{solution() = }")
| 1 |
TheAlgorithms/Python | 6,246 | Get rid of the Union | ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| cclauss | "2022-07-11T10:58:09Z" | "2022-07-11T11:11:17Z" | ba129de7f32b6acd1efd8e942aca109bacd86646 | dad789d9034ea6fb183bddb1a34b6b89d379e422 | Get rid of the Union. ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
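The file paired with this row is the weight-unit converter below. A minimal usage sketch, with results taken straight from its doctests (the import path is a placeholder, not part of the original file):

# Placeholder import path -- point it at wherever weight_conversion.py lives.
from weight_conversion import weight_conversion
print(weight_conversion("kilogram", "gram", 1))   # 1000
print(weight_conversion("gram", "pound", 3))      # 0.0066138732606
print(weight_conversion("pound", "kilogram", 4))  # 1.814368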
| """
Conversion of weight units.
__author__ = "Anubhav Solanki"
__license__ = "MIT"
__version__ = "1.1.0"
__maintainer__ = "Anubhav Solanki"
__email__ = "[email protected]"
USAGE :
-> Import this file into their respective project.
-> Use the function weight_conversion() for conversion of weight units.
-> Parameters :
-> from_type : From which type you want to convert
-> to_type : To which type you want to convert
-> value : the value which you want to convert
REFERENCES :
-> Wikipedia reference: https://en.wikipedia.org/wiki/Kilogram
-> Wikipedia reference: https://en.wikipedia.org/wiki/Gram
-> Wikipedia reference: https://en.wikipedia.org/wiki/Milligram
-> Wikipedia reference: https://en.wikipedia.org/wiki/Tonne
-> Wikipedia reference: https://en.wikipedia.org/wiki/Long_ton
-> Wikipedia reference: https://en.wikipedia.org/wiki/Short_ton
-> Wikipedia reference: https://en.wikipedia.org/wiki/Pound
-> Wikipedia reference: https://en.wikipedia.org/wiki/Ounce
-> Wikipedia reference: https://en.wikipedia.org/wiki/Fineness#Karat
-> Wikipedia reference: https://en.wikipedia.org/wiki/Dalton_(unit)
-> Wikipedia reference: https://en.wikipedia.org/wiki/Stone_(unit)
"""
KILOGRAM_CHART: dict[str, float] = {
"kilogram": 1,
"gram": pow(10, 3),
"milligram": pow(10, 6),
"metric-ton": pow(10, -3),
"long-ton": 0.0009842073,
"short-ton": 0.0011023122,
"pound": 2.2046244202,
"stone": 0.1574731728,
"ounce": 35.273990723,
"carrat": 5000,
"atomic-mass-unit": 6.022136652e26,
}
WEIGHT_TYPE_CHART: dict[str, float] = {
"kilogram": 1,
"gram": pow(10, -3),
"milligram": pow(10, -6),
"metric-ton": pow(10, 3),
"long-ton": 1016.04608,
"short-ton": 907.184,
"pound": 0.453592,
"stone": 6.35029,
"ounce": 0.0283495,
"carrat": 0.0002,
"atomic-mass-unit": 1.660540199e-27,
}
def weight_conversion(from_type: str, to_type: str, value: float) -> float:
"""
Conversion of weight unit with the help of KILOGRAM_CHART
"kilogram" : 1,
"gram" : pow(10, 3),
"milligram" : pow(10, 6),
"metric-ton" : pow(10, -3),
"long-ton" : 0.0009842073,
"short-ton" : 0.0011023122,
"pound" : 2.2046244202,
"stone": 0.1574731728,
"ounce" : 35.273990723,
"carrat" : 5000,
"atomic-mass-unit" : 6.022136652E+26
>>> weight_conversion("kilogram","kilogram",4)
4
>>> weight_conversion("kilogram","gram",1)
1000
>>> weight_conversion("kilogram","milligram",4)
4000000
>>> weight_conversion("kilogram","metric-ton",4)
0.004
>>> weight_conversion("kilogram","long-ton",3)
0.0029526219
>>> weight_conversion("kilogram","short-ton",1)
0.0011023122
>>> weight_conversion("kilogram","pound",4)
8.8184976808
>>> weight_conversion("kilogram","stone",5)
0.7873658640000001
>>> weight_conversion("kilogram","ounce",4)
141.095962892
>>> weight_conversion("kilogram","carrat",3)
15000
>>> weight_conversion("kilogram","atomic-mass-unit",1)
6.022136652e+26
>>> weight_conversion("gram","kilogram",1)
0.001
>>> weight_conversion("gram","gram",3)
3.0
>>> weight_conversion("gram","milligram",2)
2000.0
>>> weight_conversion("gram","metric-ton",4)
4e-06
>>> weight_conversion("gram","long-ton",3)
2.9526219e-06
>>> weight_conversion("gram","short-ton",3)
3.3069366000000003e-06
>>> weight_conversion("gram","pound",3)
0.0066138732606
>>> weight_conversion("gram","stone",4)
0.0006298926912000001
>>> weight_conversion("gram","ounce",1)
0.035273990723
>>> weight_conversion("gram","carrat",2)
10.0
>>> weight_conversion("gram","atomic-mass-unit",1)
6.022136652e+23
>>> weight_conversion("milligram","kilogram",1)
1e-06
>>> weight_conversion("milligram","gram",2)
0.002
>>> weight_conversion("milligram","milligram",3)
3.0
>>> weight_conversion("milligram","metric-ton",3)
3e-09
>>> weight_conversion("milligram","long-ton",3)
2.9526219e-09
>>> weight_conversion("milligram","short-ton",1)
1.1023122e-09
>>> weight_conversion("milligram","pound",3)
6.6138732605999995e-06
>>> weight_conversion("milligram","ounce",2)
7.054798144599999e-05
>>> weight_conversion("milligram","carrat",1)
0.005
>>> weight_conversion("milligram","atomic-mass-unit",1)
6.022136652e+20
>>> weight_conversion("metric-ton","kilogram",2)
2000
>>> weight_conversion("metric-ton","gram",2)
2000000
>>> weight_conversion("metric-ton","milligram",3)
3000000000
>>> weight_conversion("metric-ton","metric-ton",2)
2.0
>>> weight_conversion("metric-ton","long-ton",3)
2.9526219
>>> weight_conversion("metric-ton","short-ton",2)
2.2046244
>>> weight_conversion("metric-ton","pound",3)
6613.8732606
>>> weight_conversion("metric-ton","ounce",4)
141095.96289199998
>>> weight_conversion("metric-ton","carrat",4)
20000000
>>> weight_conversion("metric-ton","atomic-mass-unit",1)
6.022136652e+29
>>> weight_conversion("long-ton","kilogram",4)
4064.18432
>>> weight_conversion("long-ton","gram",4)
4064184.32
>>> weight_conversion("long-ton","milligram",3)
3048138240.0
>>> weight_conversion("long-ton","metric-ton",4)
4.06418432
>>> weight_conversion("long-ton","long-ton",3)
2.999999907217152
>>> weight_conversion("long-ton","short-ton",1)
1.119999989746176
>>> weight_conversion("long-ton","pound",3)
6720.000000049448
>>> weight_conversion("long-ton","ounce",1)
35840.000000060514
>>> weight_conversion("long-ton","carrat",4)
20320921.599999998
>>> weight_conversion("long-ton","atomic-mass-unit",4)
2.4475073353955697e+30
>>> weight_conversion("short-ton","kilogram",3)
2721.5519999999997
>>> weight_conversion("short-ton","gram",3)
2721552.0
>>> weight_conversion("short-ton","milligram",1)
907184000.0
>>> weight_conversion("short-ton","metric-ton",4)
3.628736
>>> weight_conversion("short-ton","long-ton",3)
2.6785713457296
>>> weight_conversion("short-ton","short-ton",3)
2.9999999725344
>>> weight_conversion("short-ton","pound",2)
4000.0000000294335
>>> weight_conversion("short-ton","ounce",4)
128000.00000021611
>>> weight_conversion("short-ton","carrat",4)
18143680.0
>>> weight_conversion("short-ton","atomic-mass-unit",1)
5.463186016507968e+29
>>> weight_conversion("pound","kilogram",4)
1.814368
>>> weight_conversion("pound","gram",2)
907.184
>>> weight_conversion("pound","milligram",3)
1360776.0
>>> weight_conversion("pound","metric-ton",3)
0.001360776
>>> weight_conversion("pound","long-ton",2)
0.0008928571152432
>>> weight_conversion("pound","short-ton",1)
0.0004999999954224
>>> weight_conversion("pound","pound",3)
3.0000000000220752
>>> weight_conversion("pound","ounce",1)
16.000000000027015
>>> weight_conversion("pound","carrat",1)
2267.96
>>> weight_conversion("pound","atomic-mass-unit",4)
1.0926372033015936e+27
>>> weight_conversion("stone","kilogram",5)
31.751450000000002
>>> weight_conversion("stone","gram",2)
12700.58
>>> weight_conversion("stone","milligram",3)
19050870.0
>>> weight_conversion("stone","metric-ton",3)
0.01905087
>>> weight_conversion("stone","long-ton",3)
0.018750005325351003
>>> weight_conversion("stone","short-ton",3)
0.021000006421614002
>>> weight_conversion("stone","pound",2)
28.00000881870372
>>> weight_conversion("stone","ounce",1)
224.00007054835967
>>> weight_conversion("stone","carrat",2)
63502.9
>>> weight_conversion("ounce","kilogram",3)
0.0850485
>>> weight_conversion("ounce","gram",3)
85.0485
>>> weight_conversion("ounce","milligram",4)
113398.0
>>> weight_conversion("ounce","metric-ton",4)
0.000113398
>>> weight_conversion("ounce","long-ton",4)
0.0001116071394054
>>> weight_conversion("ounce","short-ton",4)
0.0001249999988556
>>> weight_conversion("ounce","pound",1)
0.0625000000004599
>>> weight_conversion("ounce","ounce",2)
2.000000000003377
>>> weight_conversion("ounce","carrat",1)
141.7475
>>> weight_conversion("ounce","atomic-mass-unit",1)
1.70724563015874e+25
>>> weight_conversion("carrat","kilogram",1)
0.0002
>>> weight_conversion("carrat","gram",4)
0.8
>>> weight_conversion("carrat","milligram",2)
400.0
>>> weight_conversion("carrat","metric-ton",2)
4.0000000000000003e-07
>>> weight_conversion("carrat","long-ton",3)
5.9052438e-07
>>> weight_conversion("carrat","short-ton",4)
8.818497600000002e-07
>>> weight_conversion("carrat","pound",1)
0.00044092488404000004
>>> weight_conversion("carrat","ounce",2)
0.0141095962892
>>> weight_conversion("carrat","carrat",4)
4.0
>>> weight_conversion("carrat","atomic-mass-unit",4)
4.8177093216e+23
>>> weight_conversion("atomic-mass-unit","kilogram",4)
6.642160796e-27
>>> weight_conversion("atomic-mass-unit","gram",2)
3.321080398e-24
>>> weight_conversion("atomic-mass-unit","milligram",2)
3.3210803980000002e-21
>>> weight_conversion("atomic-mass-unit","metric-ton",3)
4.9816205970000004e-30
>>> weight_conversion("atomic-mass-unit","long-ton",3)
4.9029473573977584e-30
>>> weight_conversion("atomic-mass-unit","short-ton",1)
1.830433719948128e-30
>>> weight_conversion("atomic-mass-unit","pound",3)
1.0982602420317504e-26
>>> weight_conversion("atomic-mass-unit","ounce",2)
1.1714775914938915e-25
>>> weight_conversion("atomic-mass-unit","carrat",2)
1.660540199e-23
>>> weight_conversion("atomic-mass-unit","atomic-mass-unit",2)
1.999999998903455
"""
if to_type not in KILOGRAM_CHART or from_type not in WEIGHT_TYPE_CHART:
raise ValueError(
f"Invalid 'from_type' or 'to_type' value: {from_type!r}, {to_type!r}\n"
f"Supported values are: {', '.join(WEIGHT_TYPE_CHART)}"
)
return value * KILOGRAM_CHART[to_type] * WEIGHT_TYPE_CHART[from_type]
if __name__ == "__main__":
import doctest
doctest.testmod()
| """
Conversion of weight units.
__author__ = "Anubhav Solanki"
__license__ = "MIT"
__version__ = "1.1.0"
__maintainer__ = "Anubhav Solanki"
__email__ = "[email protected]"
USAGE :
-> Import this file into your project.
-> Use the function weight_conversion() for conversion of weight units.
-> Parameters :
-> from_type : From which type you want to convert
-> to_type : To which type you want to convert
-> value : the value which you want to convert
REFERENCES :
-> Wikipedia reference: https://en.wikipedia.org/wiki/Kilogram
-> Wikipedia reference: https://en.wikipedia.org/wiki/Gram
-> Wikipedia reference: https://en.wikipedia.org/wiki/Millimetre
-> Wikipedia reference: https://en.wikipedia.org/wiki/Tonne
-> Wikipedia reference: https://en.wikipedia.org/wiki/Long_ton
-> Wikipedia reference: https://en.wikipedia.org/wiki/Short_ton
-> Wikipedia reference: https://en.wikipedia.org/wiki/Pound
-> Wikipedia reference: https://en.wikipedia.org/wiki/Ounce
-> Wikipedia reference: https://en.wikipedia.org/wiki/Fineness#Karat
-> Wikipedia reference: https://en.wikipedia.org/wiki/Dalton_(unit)
-> Wikipedia reference: https://en.wikipedia.org/wiki/Stone_(unit)
"""
KILOGRAM_CHART: dict[str, float] = {
"kilogram": 1,
"gram": pow(10, 3),
"milligram": pow(10, 6),
"metric-ton": pow(10, -3),
"long-ton": 0.0009842073,
"short-ton": 0.0011023122,
"pound": 2.2046244202,
"stone": 0.1574731728,
"ounce": 35.273990723,
"carrat": 5000,
"atomic-mass-unit": 6.022136652e26,
}
WEIGHT_TYPE_CHART: dict[str, float] = {
"kilogram": 1,
"gram": pow(10, -3),
"milligram": pow(10, -6),
"metric-ton": pow(10, 3),
"long-ton": 1016.04608,
"short-ton": 907.184,
"pound": 0.453592,
"stone": 6.35029,
"ounce": 0.0283495,
"carrat": 0.0002,
"atomic-mass-unit": 1.660540199e-27,
}
def weight_conversion(from_type: str, to_type: str, value: float) -> float:
"""
Conversion of weight units with the help of KILOGRAM_CHART
"kilogram" : 1,
"gram" : pow(10, 3),
"milligram" : pow(10, 6),
"metric-ton" : pow(10, -3),
"long-ton" : 0.0009842073,
"short-ton" : 0.0011023122,
"pound" : 2.2046244202,
"stone": 0.1574731728,
"ounce" : 35.273990723,
"carrat" : 5000,
"atomic-mass-unit" : 6.022136652E+26
>>> weight_conversion("kilogram","kilogram",4)
4
>>> weight_conversion("kilogram","gram",1)
1000
>>> weight_conversion("kilogram","milligram",4)
4000000
>>> weight_conversion("kilogram","metric-ton",4)
0.004
>>> weight_conversion("kilogram","long-ton",3)
0.0029526219
>>> weight_conversion("kilogram","short-ton",1)
0.0011023122
>>> weight_conversion("kilogram","pound",4)
8.8184976808
>>> weight_conversion("kilogram","stone",5)
0.7873658640000001
>>> weight_conversion("kilogram","ounce",4)
141.095962892
>>> weight_conversion("kilogram","carrat",3)
15000
>>> weight_conversion("kilogram","atomic-mass-unit",1)
6.022136652e+26
>>> weight_conversion("gram","kilogram",1)
0.001
>>> weight_conversion("gram","gram",3)
3.0
>>> weight_conversion("gram","milligram",2)
2000.0
>>> weight_conversion("gram","metric-ton",4)
4e-06
>>> weight_conversion("gram","long-ton",3)
2.9526219e-06
>>> weight_conversion("gram","short-ton",3)
3.3069366000000003e-06
>>> weight_conversion("gram","pound",3)
0.0066138732606
>>> weight_conversion("gram","stone",4)
0.0006298926912000001
>>> weight_conversion("gram","ounce",1)
0.035273990723
>>> weight_conversion("gram","carrat",2)
10.0
>>> weight_conversion("gram","atomic-mass-unit",1)
6.022136652e+23
>>> weight_conversion("milligram","kilogram",1)
1e-06
>>> weight_conversion("milligram","gram",2)
0.002
>>> weight_conversion("milligram","milligram",3)
3.0
>>> weight_conversion("milligram","metric-ton",3)
3e-09
>>> weight_conversion("milligram","long-ton",3)
2.9526219e-09
>>> weight_conversion("milligram","short-ton",1)
1.1023122e-09
>>> weight_conversion("milligram","pound",3)
6.6138732605999995e-06
>>> weight_conversion("milligram","ounce",2)
7.054798144599999e-05
>>> weight_conversion("milligram","carrat",1)
0.005
>>> weight_conversion("milligram","atomic-mass-unit",1)
6.022136652e+20
>>> weight_conversion("metric-ton","kilogram",2)
2000
>>> weight_conversion("metric-ton","gram",2)
2000000
>>> weight_conversion("metric-ton","milligram",3)
3000000000
>>> weight_conversion("metric-ton","metric-ton",2)
2.0
>>> weight_conversion("metric-ton","long-ton",3)
2.9526219
>>> weight_conversion("metric-ton","short-ton",2)
2.2046244
>>> weight_conversion("metric-ton","pound",3)
6613.8732606
>>> weight_conversion("metric-ton","ounce",4)
141095.96289199998
>>> weight_conversion("metric-ton","carrat",4)
20000000
>>> weight_conversion("metric-ton","atomic-mass-unit",1)
6.022136652e+29
>>> weight_conversion("long-ton","kilogram",4)
4064.18432
>>> weight_conversion("long-ton","gram",4)
4064184.32
>>> weight_conversion("long-ton","milligram",3)
3048138240.0
>>> weight_conversion("long-ton","metric-ton",4)
4.06418432
>>> weight_conversion("long-ton","long-ton",3)
2.999999907217152
>>> weight_conversion("long-ton","short-ton",1)
1.119999989746176
>>> weight_conversion("long-ton","pound",3)
6720.000000049448
>>> weight_conversion("long-ton","ounce",1)
35840.000000060514
>>> weight_conversion("long-ton","carrat",4)
20320921.599999998
>>> weight_conversion("long-ton","atomic-mass-unit",4)
2.4475073353955697e+30
>>> weight_conversion("short-ton","kilogram",3)
2721.5519999999997
>>> weight_conversion("short-ton","gram",3)
2721552.0
>>> weight_conversion("short-ton","milligram",1)
907184000.0
>>> weight_conversion("short-ton","metric-ton",4)
3.628736
>>> weight_conversion("short-ton","long-ton",3)
2.6785713457296
>>> weight_conversion("short-ton","short-ton",3)
2.9999999725344
>>> weight_conversion("short-ton","pound",2)
4000.0000000294335
>>> weight_conversion("short-ton","ounce",4)
128000.00000021611
>>> weight_conversion("short-ton","carrat",4)
18143680.0
>>> weight_conversion("short-ton","atomic-mass-unit",1)
5.463186016507968e+29
>>> weight_conversion("pound","kilogram",4)
1.814368
>>> weight_conversion("pound","gram",2)
907.184
>>> weight_conversion("pound","milligram",3)
1360776.0
>>> weight_conversion("pound","metric-ton",3)
0.001360776
>>> weight_conversion("pound","long-ton",2)
0.0008928571152432
>>> weight_conversion("pound","short-ton",1)
0.0004999999954224
>>> weight_conversion("pound","pound",3)
3.0000000000220752
>>> weight_conversion("pound","ounce",1)
16.000000000027015
>>> weight_conversion("pound","carrat",1)
2267.96
>>> weight_conversion("pound","atomic-mass-unit",4)
1.0926372033015936e+27
>>> weight_conversion("stone","kilogram",5)
31.751450000000002
>>> weight_conversion("stone","gram",2)
12700.58
>>> weight_conversion("stone","milligram",3)
19050870.0
>>> weight_conversion("stone","metric-ton",3)
0.01905087
>>> weight_conversion("stone","long-ton",3)
0.018750005325351003
>>> weight_conversion("stone","short-ton",3)
0.021000006421614002
>>> weight_conversion("stone","pound",2)
28.00000881870372
>>> weight_conversion("stone","ounce",1)
224.00007054835967
>>> weight_conversion("stone","carrat",2)
63502.9
>>> weight_conversion("ounce","kilogram",3)
0.0850485
>>> weight_conversion("ounce","gram",3)
85.0485
>>> weight_conversion("ounce","milligram",4)
113398.0
>>> weight_conversion("ounce","metric-ton",4)
0.000113398
>>> weight_conversion("ounce","long-ton",4)
0.0001116071394054
>>> weight_conversion("ounce","short-ton",4)
0.0001249999988556
>>> weight_conversion("ounce","pound",1)
0.0625000000004599
>>> weight_conversion("ounce","ounce",2)
2.000000000003377
>>> weight_conversion("ounce","carrat",1)
141.7475
>>> weight_conversion("ounce","atomic-mass-unit",1)
1.70724563015874e+25
>>> weight_conversion("carrat","kilogram",1)
0.0002
>>> weight_conversion("carrat","gram",4)
0.8
>>> weight_conversion("carrat","milligram",2)
400.0
>>> weight_conversion("carrat","metric-ton",2)
4.0000000000000003e-07
>>> weight_conversion("carrat","long-ton",3)
5.9052438e-07
>>> weight_conversion("carrat","short-ton",4)
8.818497600000002e-07
>>> weight_conversion("carrat","pound",1)
0.00044092488404000004
>>> weight_conversion("carrat","ounce",2)
0.0141095962892
>>> weight_conversion("carrat","carrat",4)
4.0
>>> weight_conversion("carrat","atomic-mass-unit",4)
4.8177093216e+23
>>> weight_conversion("atomic-mass-unit","kilogram",4)
6.642160796e-27
>>> weight_conversion("atomic-mass-unit","gram",2)
3.321080398e-24
>>> weight_conversion("atomic-mass-unit","milligram",2)
3.3210803980000002e-21
>>> weight_conversion("atomic-mass-unit","metric-ton",3)
4.9816205970000004e-30
>>> weight_conversion("atomic-mass-unit","long-ton",3)
4.9029473573977584e-30
>>> weight_conversion("atomic-mass-unit","short-ton",1)
1.830433719948128e-30
>>> weight_conversion("atomic-mass-unit","pound",3)
1.0982602420317504e-26
>>> weight_conversion("atomic-mass-unit","ounce",2)
1.1714775914938915e-25
>>> weight_conversion("atomic-mass-unit","carrat",2)
1.660540199e-23
>>> weight_conversion("atomic-mass-unit","atomic-mass-unit",2)
1.999999998903455
"""
if to_type not in KILOGRAM_CHART or from_type not in WEIGHT_TYPE_CHART:
raise ValueError(
f"Invalid 'from_type' or 'to_type' value: {from_type!r}, {to_type!r}\n"
f"Supported values are: {', '.join(WEIGHT_TYPE_CHART)}"
)
return value * KILOGRAM_CHART[to_type] * WEIGHT_TYPE_CHART[from_type]
if __name__ == "__main__":
import doctest
doctest.testmod()
| -1 |
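The weight-conversion listing above routes every conversion through kilograms: multiply by the factor that takes the source unit to kilograms, then by the factor that takes kilograms to the target unit. A minimal self-contained sketch of that two-chart idea, trimmed to three units (the `convert` name and the reduced charts are illustrative, not part of the PR):

```python
# Reduced two-chart sketch: convert the source unit to kilograms, then
# kilograms to the target unit, mirroring the listing above.
to_kilogram = {"kilogram": 1, "gram": 10**-3, "pound": 0.453592}
per_kilogram = {"kilogram": 1, "gram": 10**3, "pound": 2.2046244202}

def convert(from_type: str, to_type: str, value: float) -> float:
    if from_type not in to_kilogram or to_type not in per_kilogram:
        raise ValueError(f"Unsupported unit: {from_type!r} or {to_type!r}")
    return value * per_kilogram[to_type] * to_kilogram[from_type]

if __name__ == "__main__":
    print(convert("pound", "gram", 2))  # 907.184, as in the pound->gram doctest above
```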
TheAlgorithms/Python | 6,246 | Get rid of the Union | ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| cclauss | "2022-07-11T10:58:09Z" | "2022-07-11T11:11:17Z" | ba129de7f32b6acd1efd8e942aca109bacd86646 | dad789d9034ea6fb183bddb1a34b6b89d379e422 | Get rid of the Union. ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| """
https://en.wikipedia.org/wiki/Component_(graph_theory)
Finding connected components in graph
"""
test_graph_1 = {0: [1, 2], 1: [0, 3], 2: [0], 3: [1], 4: [5, 6], 5: [4, 6], 6: [4, 5]}
test_graph_2 = {0: [1, 2, 3], 1: [0, 3], 2: [0], 3: [0, 1], 4: [], 5: []}
def dfs(graph: dict, vert: int, visited: list) -> list:
"""
Use depth first search to find all vertices
being in the same component as initial vertex
>>> dfs(test_graph_1, 0, 5 * [False])
[0, 1, 3, 2]
>>> dfs(test_graph_2, 0, 6 * [False])
[0, 1, 3, 2]
"""
visited[vert] = True
connected_verts = []
for neighbour in graph[vert]:
if not visited[neighbour]:
connected_verts += dfs(graph, neighbour, visited)
return [vert] + connected_verts
def connected_components(graph: dict) -> list:
"""
This function takes graph as a parameter
and then returns the list of connected components
>>> connected_components(test_graph_1)
[[0, 1, 3, 2], [4, 5, 6]]
>>> connected_components(test_graph_2)
[[0, 1, 3, 2], [4], [5]]
"""
graph_size = len(graph)
visited = graph_size * [False]
components_list = []
for i in range(graph_size):
if not visited[i]:
i_connected = dfs(graph, i, visited)
components_list.append(i_connected)
return components_list
if __name__ == "__main__":
import doctest
doctest.testmod()
| """
https://en.wikipedia.org/wiki/Component_(graph_theory)
Finding connected components in graph
"""
test_graph_1 = {0: [1, 2], 1: [0, 3], 2: [0], 3: [1], 4: [5, 6], 5: [4, 6], 6: [4, 5]}
test_graph_2 = {0: [1, 2, 3], 1: [0, 3], 2: [0], 3: [0, 1], 4: [], 5: []}
def dfs(graph: dict, vert: int, visited: list) -> list:
"""
Use depth first search to find all vertices
being in the same component as initial vertex
>>> dfs(test_graph_1, 0, 5 * [False])
[0, 1, 3, 2]
>>> dfs(test_graph_2, 0, 6 * [False])
[0, 1, 3, 2]
"""
visited[vert] = True
connected_verts = []
for neighbour in graph[vert]:
if not visited[neighbour]:
connected_verts += dfs(graph, neighbour, visited)
return [vert] + connected_verts
def connected_components(graph: dict) -> list:
"""
This function takes graph as a parameter
and then returns the list of connected components
>>> connected_components(test_graph_1)
[[0, 1, 3, 2], [4, 5, 6]]
>>> connected_components(test_graph_2)
[[0, 1, 3, 2], [4], [5]]
"""
graph_size = len(graph)
visited = graph_size * [False]
components_list = []
for i in range(graph_size):
if not visited[i]:
i_connected = dfs(graph, i, visited)
components_list.append(i_connected)
return components_list
if __name__ == "__main__":
import doctest
doctest.testmod()
| -1 |
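The connected-components listing above grows each component with a recursive depth-first search. The same grouping can be collected iteratively with an explicit stack; the sketch below (hypothetical `components` helper, not part of the PR) finds the two components of the first test graph, though the vertex order inside each component may differ from the recursive version.

```python
# Iterative flood-fill over an adjacency-list dict; same grouping as the
# recursive dfs()/connected_components() pair in the listing above.
def components(graph: dict[int, list[int]]) -> list[list[int]]:
    seen: set[int] = set()
    result: list[list[int]] = []
    for start in graph:
        if start in seen:
            continue
        stack, group = [start], []
        seen.add(start)
        while stack:
            vert = stack.pop()
            group.append(vert)
            for neighbour in graph[vert]:
                if neighbour not in seen:
                    seen.add(neighbour)
                    stack.append(neighbour)
        result.append(group)
    return result

if __name__ == "__main__":
    g = {0: [1, 2], 1: [0, 3], 2: [0], 3: [1], 4: [5, 6], 5: [4, 6], 6: [4, 5]}
    print(components(g))  # two components: {0, 1, 2, 3} and {4, 5, 6}
```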
TheAlgorithms/Python | 6,246 | Get rid of the Union | ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| cclauss | "2022-07-11T10:58:09Z" | "2022-07-11T11:11:17Z" | ba129de7f32b6acd1efd8e942aca109bacd86646 | dad789d9034ea6fb183bddb1a34b6b89d379e422 | Get rid of the Union. ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| # flake8: noqa
"""The following implementation assumes that the activities
are already sorted according to their finish time"""
"""Prints a maximum set of activities that can be done by a
single person, one at a time"""
# n --> Total number of activities
# start[]--> An array that contains start time of all activities
# finish[] --> An array that contains finish time of all activities
def printMaxActivities(start: list[int], finish: list[int]) -> None:
"""
>>> start = [1, 3, 0, 5, 8, 5]
>>> finish = [2, 4, 6, 7, 9, 9]
>>> printMaxActivities(start, finish)
The following activities are selected:
0,1,3,4,
"""
n = len(finish)
print("The following activities are selected:")
# The first activity is always selected
i = 0
print(i, end=",")
# Consider rest of the activities
for j in range(n):
# If this activity has start time greater than
# or equal to the finish time of previously
# selected activity, then select it
if start[j] >= finish[i]:
print(j, end=",")
i = j
if __name__ == "__main__":
import doctest
doctest.testmod()
start = [1, 3, 0, 5, 8, 5]
finish = [2, 4, 6, 7, 9, 9]
printMaxActivities(start, finish)
| # flake8: noqa
"""The following implementation assumes that the activities
are already sorted according to their finish time"""
"""Prints a maximum set of activities that can be done by a
single person, one at a time"""
# n --> Total number of activities
# start[]--> An array that contains start time of all activities
# finish[] --> An array that contains finish time of all activities
def printMaxActivities(start: list[int], finish: list[int]) -> None:
"""
>>> start = [1, 3, 0, 5, 8, 5]
>>> finish = [2, 4, 6, 7, 9, 9]
>>> printMaxActivities(start, finish)
The following activities are selected:
0,1,3,4,
"""
n = len(finish)
print("The following activities are selected:")
# The first activity is always selected
i = 0
print(i, end=",")
# Consider rest of the activities
for j in range(n):
# If this activity has start time greater than
# or equal to the finish time of previously
# selected activity, then select it
if start[j] >= finish[i]:
print(j, end=",")
i = j
if __name__ == "__main__":
import doctest
doctest.testmod()
start = [1, 3, 0, 5, 8, 5]
finish = [2, 4, 6, 7, 9, 9]
printMaxActivities(start, finish)
| -1 |
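The greedy rule in the activity-selection listing above keeps an activity whenever its start time is not earlier than the finish time of the last activity kept. A sketch of the same rule that returns the selected indices instead of printing them (hypothetical `select_activities`, assuming the inputs are already sorted by finish time):

```python
# Same greedy rule as printMaxActivities above, but returning indices
# instead of printing them; assumes activities are pre-sorted by finish time.
def select_activities(start: list[int], finish: list[int]) -> list[int]:
    selected = [0]  # the first (earliest-finishing) activity is always kept
    last_finish = finish[0]
    for j in range(1, len(finish)):
        if start[j] >= last_finish:
            selected.append(j)
            last_finish = finish[j]
    return selected

if __name__ == "__main__":
    print(select_activities([1, 3, 0, 5, 8, 5], [2, 4, 6, 7, 9, 9]))  # [0, 1, 3, 4]
```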
TheAlgorithms/Python | 6,246 | Get rid of the Union | ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| cclauss | "2022-07-11T10:58:09Z" | "2022-07-11T11:11:17Z" | ba129de7f32b6acd1efd8e942aca109bacd86646 | dad789d9034ea6fb183bddb1a34b6b89d379e422 | Get rid of the Union. ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| """
Project Euler Problem 5: https://projecteuler.net/problem=5
Smallest multiple
2520 is the smallest number that can be divided by each of the numbers
from 1 to 10 without any remainder.
What is the smallest positive number that is _evenly divisible_ by all
of the numbers from 1 to 20?
References:
- https://en.wiktionary.org/wiki/evenly_divisible
- https://en.wikipedia.org/wiki/Euclidean_algorithm
- https://en.wikipedia.org/wiki/Least_common_multiple
"""
def gcd(x: int, y: int) -> int:
"""
Euclidean GCD algorithm (Greatest Common Divisor)
>>> gcd(0, 0)
0
>>> gcd(23, 42)
1
>>> gcd(15, 33)
3
>>> gcd(12345, 67890)
15
"""
return x if y == 0 else gcd(y, x % y)
def lcm(x: int, y: int) -> int:
"""
Least Common Multiple.
Using the property that lcm(a, b) * gcd(a, b) = a*b
>>> lcm(3, 15)
15
>>> lcm(1, 27)
27
>>> lcm(13, 27)
351
>>> lcm(64, 48)
192
"""
return (x * y) // gcd(x, y)
def solution(n: int = 20) -> int:
"""
Returns the smallest positive number that is evenly divisible (divisible
with no remainder) by all of the numbers from 1 to n.
>>> solution(10)
2520
>>> solution(15)
360360
>>> solution(22)
232792560
"""
g = 1
for i in range(1, n + 1):
g = lcm(g, i)
return g
if __name__ == "__main__":
print(f"{solution() = }")
| """
Project Euler Problem 5: https://projecteuler.net/problem=5
Smallest multiple
2520 is the smallest number that can be divided by each of the numbers
from 1 to 10 without any remainder.
What is the smallest positive number that is _evenly divisible_ by all
of the numbers from 1 to 20?
References:
- https://en.wiktionary.org/wiki/evenly_divisible
- https://en.wikipedia.org/wiki/Euclidean_algorithm
- https://en.wikipedia.org/wiki/Least_common_multiple
"""
def gcd(x: int, y: int) -> int:
"""
Euclidean GCD algorithm (Greatest Common Divisor)
>>> gcd(0, 0)
0
>>> gcd(23, 42)
1
>>> gcd(15, 33)
3
>>> gcd(12345, 67890)
15
"""
return x if y == 0 else gcd(y, x % y)
def lcm(x: int, y: int) -> int:
"""
Least Common Multiple.
Using the property that lcm(a, b) * gcd(a, b) = a*b
>>> lcm(3, 15)
15
>>> lcm(1, 27)
27
>>> lcm(13, 27)
351
>>> lcm(64, 48)
192
"""
return (x * y) // gcd(x, y)
def solution(n: int = 20) -> int:
"""
Returns the smallest positive number that is evenly divisible (divisible
with no remainder) by all of the numbers from 1 to n.
>>> solution(10)
2520
>>> solution(15)
360360
>>> solution(22)
232792560
"""
g = 1
for i in range(1, n + 1):
g = lcm(g, i)
return g
if __name__ == "__main__":
print(f"{solution() = }")
| -1 |
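The Project Euler solution above folds lcm over 1..n using lcm(a, b) = a * b // gcd(a, b). The same fold can be cross-checked with the standard library's math.gcd and functools.reduce; the sketch below is illustrative only, not part of the PR.

```python
# Cross-check of the lcm-folding approach using the standard library.
from functools import reduce
from math import gcd

def smallest_multiple(n: int = 20) -> int:
    return reduce(lambda acc, i: acc * i // gcd(acc, i), range(1, n + 1), 1)

if __name__ == "__main__":
    print(smallest_multiple(10))  # 2520
    print(smallest_multiple(20))  # 232792560
```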
TheAlgorithms/Python | 6,246 | Get rid of the Union | ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| cclauss | "2022-07-11T10:58:09Z" | "2022-07-11T11:11:17Z" | ba129de7f32b6acd1efd8e942aca109bacd86646 | dad789d9034ea6fb183bddb1a34b6b89d379e422 | Get rid of the Union. ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| -1 |
||
TheAlgorithms/Python | 6,246 | Get rid of the Union | ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| cclauss | "2022-07-11T10:58:09Z" | "2022-07-11T11:11:17Z" | ba129de7f32b6acd1efd8e942aca109bacd86646 | dad789d9034ea6fb183bddb1a34b6b89d379e422 | Get rid of the Union. ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| # Eulerian Path is a path in graph that visits every edge exactly once.
# Eulerian Circuit is an Eulerian Path which starts and ends on the same
# vertex.
# time complexity is O(V+E)
# space complexity is O(VE)
# using dfs for finding eulerian path traversal
def dfs(u, graph, visited_edge, path=None):
path = (path or []) + [u]
for v in graph[u]:
if visited_edge[u][v] is False:
visited_edge[u][v], visited_edge[v][u] = True, True
path = dfs(v, graph, visited_edge, path)
return path
# for checking if the graph has an Euler path or circuit
def check_circuit_or_path(graph, max_node):
odd_degree_nodes = 0
odd_node = -1
for i in range(max_node):
if i not in graph.keys():
continue
if len(graph[i]) % 2 == 1:
odd_degree_nodes += 1
odd_node = i
if odd_degree_nodes == 0:
return 1, odd_node
if odd_degree_nodes == 2:
return 2, odd_node
return 3, odd_node
def check_euler(graph, max_node):
visited_edge = [[False for _ in range(max_node + 1)] for _ in range(max_node + 1)]
check, odd_node = check_circuit_or_path(graph, max_node)
if check == 3:
print("graph is not Eulerian")
print("no path")
return
start_node = 1
if check == 2:
start_node = odd_node
print("graph has a Euler path")
if check == 1:
print("graph has a Euler cycle")
path = dfs(start_node, graph, visited_edge)
print(path)
def main():
G1 = {1: [2, 3, 4], 2: [1, 3], 3: [1, 2], 4: [1, 5], 5: [4]}
G2 = {1: [2, 3, 4, 5], 2: [1, 3], 3: [1, 2], 4: [1, 5], 5: [1, 4]}
G3 = {1: [2, 3, 4], 2: [1, 3, 4], 3: [1, 2], 4: [1, 2, 5], 5: [4]}
G4 = {1: [2, 3], 2: [1, 3], 3: [1, 2]}
G5 = {
1: [],
2: []
# all degree is zero
}
max_node = 10
check_euler(G1, max_node)
check_euler(G2, max_node)
check_euler(G3, max_node)
check_euler(G4, max_node)
check_euler(G5, max_node)
if __name__ == "__main__":
main()
| # Eulerian Path is a path in graph that visits every edge exactly once.
# Eulerian Circuit is an Eulerian Path which starts and ends on the same
# vertex.
# time complexity is O(V+E)
# space complexity is O(VE)
# using dfs for finding eulerian path traversal
def dfs(u, graph, visited_edge, path=None):
path = (path or []) + [u]
for v in graph[u]:
if visited_edge[u][v] is False:
visited_edge[u][v], visited_edge[v][u] = True, True
path = dfs(v, graph, visited_edge, path)
return path
# for checking if the graph has an Euler path or circuit
def check_circuit_or_path(graph, max_node):
odd_degree_nodes = 0
odd_node = -1
for i in range(max_node):
if i not in graph.keys():
continue
if len(graph[i]) % 2 == 1:
odd_degree_nodes += 1
odd_node = i
if odd_degree_nodes == 0:
return 1, odd_node
if odd_degree_nodes == 2:
return 2, odd_node
return 3, odd_node
def check_euler(graph, max_node):
visited_edge = [[False for _ in range(max_node + 1)] for _ in range(max_node + 1)]
check, odd_node = check_circuit_or_path(graph, max_node)
if check == 3:
print("graph is not Eulerian")
print("no path")
return
start_node = 1
if check == 2:
start_node = odd_node
print("graph has a Euler path")
if check == 1:
print("graph has a Euler cycle")
path = dfs(start_node, graph, visited_edge)
print(path)
def main():
G1 = {1: [2, 3, 4], 2: [1, 3], 3: [1, 2], 4: [1, 5], 5: [4]}
G2 = {1: [2, 3, 4, 5], 2: [1, 3], 3: [1, 2], 4: [1, 5], 5: [1, 4]}
G3 = {1: [2, 3, 4], 2: [1, 3, 4], 3: [1, 2], 4: [1, 2, 5], 5: [4]}
G4 = {1: [2, 3], 2: [1, 3], 3: [1, 2]}
G5 = {
1: [],
2: []
# all degree is zero
}
max_node = 10
check_euler(G1, max_node)
check_euler(G2, max_node)
check_euler(G3, max_node)
check_euler(G4, max_node)
check_euler(G5, max_node)
if __name__ == "__main__":
main()
| -1 |
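The classification in check_circuit_or_path above rests on counting odd-degree vertices: zero odd vertices means an Euler circuit, exactly two means an Euler path, anything else means neither (assuming the graph is connected). A compact sketch of just that degree test (hypothetical `euler_kind`, not part of the PR):

```python
# Degree test behind check_circuit_or_path above: count odd-degree vertices.
# Assumes an undirected, connected graph given as {vertex: [neighbours]}.
def euler_kind(graph: dict[int, list[int]]) -> str:
    odd = [v for v, neighbours in graph.items() if len(neighbours) % 2 == 1]
    if not odd:
        return "Euler circuit"
    if len(odd) == 2:
        return "Euler path"
    return "neither"

if __name__ == "__main__":
    print(euler_kind({1: [2, 3, 4], 2: [1, 3], 3: [1, 2], 4: [1, 5], 5: [4]}))  # Euler path
    print(euler_kind({1: [2, 3], 2: [1, 3], 3: [1, 2]}))  # Euler circuit
```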
TheAlgorithms/Python | 6,246 | Get rid of the Union | ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| cclauss | "2022-07-11T10:58:09Z" | "2022-07-11T11:11:17Z" | ba129de7f32b6acd1efd8e942aca109bacd86646 | dad789d9034ea6fb183bddb1a34b6b89d379e422 | Get rid of the Union. ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| """
== Krishnamurthy Number ==
It is also known as Peterson Number
A Krishnamurthy Number is a number for which the sum
of the factorials of its digits equals the number itself.
For example: 145 = 1! + 4! + 5!
So, 145 is a Krishnamurthy Number
"""
def factorial(digit: int) -> int:
"""
>>> factorial(3)
6
>>> factorial(0)
1
>>> factorial(5)
120
"""
return 1 if digit in (0, 1) else (digit * factorial(digit - 1))
def krishnamurthy(number: int) -> bool:
"""
>>> krishnamurthy(145)
True
>>> krishnamurthy(240)
False
>>> krishnamurthy(1)
True
"""
factSum = 0
duplicate = number
while duplicate > 0:
duplicate, digit = divmod(duplicate, 10)
factSum += factorial(digit)
return factSum == number
if __name__ == "__main__":
print("Program to check whether a number is a Krisnamurthy Number or not.")
number = int(input("Enter number: ").strip())
print(
f"{number} is {'' if krishnamurthy(number) else 'not '}a Krishnamurthy Number."
)
| """
== Krishnamurthy Number ==
It is also known as Peterson Number
A Krishnamurthy Number is a number for which the sum
of the factorials of its digits equals the number itself.
For example: 145 = 1! + 4! + 5!
So, 145 is a Krishnamurthy Number
"""
def factorial(digit: int) -> int:
"""
>>> factorial(3)
6
>>> factorial(0)
1
>>> factorial(5)
120
"""
return 1 if digit in (0, 1) else (digit * factorial(digit - 1))
def krishnamurthy(number: int) -> bool:
"""
>>> krishnamurthy(145)
True
>>> krishnamurthy(240)
False
>>> krishnamurthy(1)
True
"""
factSum = 0
duplicate = number
while duplicate > 0:
duplicate, digit = divmod(duplicate, 10)
factSum += factorial(digit)
return factSum == number
if __name__ == "__main__":
print("Program to check whether a number is a Krisnamurthy Number or not.")
number = int(input("Enter number: ").strip())
print(
f"{number} is {'' if krishnamurthy(number) else 'not '}a Krishnamurthy Number."
)
| -1 |
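The digit-factorial-sum property used above can also be checked against math.factorial over the decimal digits. The sketch below (hypothetical `is_krishnamurthy`, assuming a non-negative integer input) lists every such number below 50000.

```python
# Cross-check of the digit-factorial-sum property using math.factorial.
from math import factorial

def is_krishnamurthy(number: int) -> bool:
    # assumes a non-negative integer
    return number == sum(factorial(int(digit)) for digit in str(number))

if __name__ == "__main__":
    print([n for n in range(1, 50000) if is_krishnamurthy(n)])  # [1, 2, 145, 40585]
```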
TheAlgorithms/Python | 6,246 | Get rid of the Union | ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| cclauss | "2022-07-11T10:58:09Z" | "2022-07-11T11:11:17Z" | ba129de7f32b6acd1efd8e942aca109bacd86646 | dad789d9034ea6fb183bddb1a34b6b89d379e422 | Get rid of the Union. ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| -1 |
||
TheAlgorithms/Python | 6,246 | Get rid of the Union | ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| cclauss | "2022-07-11T10:58:09Z" | "2022-07-11T11:11:17Z" | ba129de7f32b6acd1efd8e942aca109bacd86646 | dad789d9034ea6fb183bddb1a34b6b89d379e422 | Get rid of the Union. ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| """
developed by: markmelnic
original repo: https://github.com/markmelnic/Scoring-Algorithm
Analyse data using a range based percentual proximity algorithm
and calculate the linear maximum likelihood estimation.
The basic principle is that all values supplied will be broken
down to a range from 0 to 1 and each column's score will be added
up to get the total score.
==========
Example for data of vehicles
price|mileage|registration_year
20k |60k |2012
22k |50k |2011
23k |90k |2015
16k |210k |2010
We want the vehicle with the lowest price,
lowest mileage but newest registration year.
Thus the weights for each column are as follows:
[0, 0, 1]
"""
def procentual_proximity(
source_data: list[list[float]], weights: list[int]
) -> list[list[float]]:
"""
weights - int list
possible values - 0 / 1
0 if lower values have higher weight in the data set
1 if higher values have higher weight in the data set
>>> procentual_proximity([[20, 60, 2012],[23, 90, 2015],[22, 50, 2011]], [0, 0, 1])
[[20, 60, 2012, 2.0], [23, 90, 2015, 1.0], [22, 50, 2011, 1.3333333333333335]]
"""
# getting data
data_lists: list[list[float]] = []
for data in source_data:
for i, el in enumerate(data):
if len(data_lists) < i + 1:
data_lists.append([])
data_lists[i].append(float(el))
score_lists: list[list[float]] = []
# calculating each score
for dlist, weight in zip(data_lists, weights):
mind = min(dlist)
maxd = max(dlist)
score: list[float] = []
# for weight 0 score is 1 - actual score
if weight == 0:
for item in dlist:
try:
score.append(1 - ((item - mind) / (maxd - mind)))
except ZeroDivisionError:
score.append(1)
elif weight == 1:
for item in dlist:
try:
score.append((item - mind) / (maxd - mind))
except ZeroDivisionError:
score.append(0)
# weight not 0 or 1
else:
raise ValueError(f"Invalid weight of {weight:f} provided")
score_lists.append(score)
# initialize final scores
final_scores: list[float] = [0 for i in range(len(score_lists[0]))]
# generate final scores
for i, slist in enumerate(score_lists):
for j, ele in enumerate(slist):
final_scores[j] = final_scores[j] + ele
# append scores to source data
for i, ele in enumerate(final_scores):
source_data[i].append(ele)
return source_data
| """
developed by: markmelnic
original repo: https://github.com/markmelnic/Scoring-Algorithm
Analyse data using a range based percentual proximity algorithm
and calculate the linear maximum likelihood estimation.
The basic principle is that all values supplied will be broken
down to a range from 0 to 1 and each column's score will be added
up to get the total score.
==========
Example for data of vehicles
price|mileage|registration_year
20k |60k |2012
22k |50k |2011
23k |90k |2015
16k |210k |2010
We want the vehicle with the lowest price,
lowest mileage but newest registration year.
Thus the weights for each column are as follows:
[0, 0, 1]
"""
def procentual_proximity(
source_data: list[list[float]], weights: list[int]
) -> list[list[float]]:
"""
weights - int list
possible values - 0 / 1
0 if lower values have higher weight in the data set
1 if higher values have higher weight in the data set
>>> procentual_proximity([[20, 60, 2012],[23, 90, 2015],[22, 50, 2011]], [0, 0, 1])
[[20, 60, 2012, 2.0], [23, 90, 2015, 1.0], [22, 50, 2011, 1.3333333333333335]]
"""
# getting data
data_lists: list[list[float]] = []
for data in source_data:
for i, el in enumerate(data):
if len(data_lists) < i + 1:
data_lists.append([])
data_lists[i].append(float(el))
score_lists: list[list[float]] = []
# calculating each score
for dlist, weight in zip(data_lists, weights):
mind = min(dlist)
maxd = max(dlist)
score: list[float] = []
# for weight 0 score is 1 - actual score
if weight == 0:
for item in dlist:
try:
score.append(1 - ((item - mind) / (maxd - mind)))
except ZeroDivisionError:
score.append(1)
elif weight == 1:
for item in dlist:
try:
score.append((item - mind) / (maxd - mind))
except ZeroDivisionError:
score.append(0)
# weight not 0 or 1
else:
raise ValueError(f"Invalid weight of {weight:f} provided")
score_lists.append(score)
# initialize final scores
final_scores: list[float] = [0 for i in range(len(score_lists[0]))]
# generate final scores
for i, slist in enumerate(score_lists):
for j, ele in enumerate(slist):
final_scores[j] = final_scores[j] + ele
# append scores to source data
for i, ele in enumerate(final_scores):
source_data[i].append(ele)
return source_data
| -1 |
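The scoring listing above min-max scales each column to [0, 1], inverts columns whose weight is 0 (lower is better), and sums the per-column scores for each row. A compact sketch of that scale-and-sum step (hypothetical `score_rows`, not part of the PR), reproducing the vehicle doctest:

```python
# Min-max scale each column, invert "lower is better" columns (weight 0),
# then sum the per-column scores for each row -- the core of the listing above.
def score_rows(rows: list[list[float]], weights: list[int]) -> list[float]:
    totals = [0.0] * len(rows)
    for col, weight in enumerate(weights):
        column = [row[col] for row in rows]
        lo, hi = min(column), max(column)
        for i, item in enumerate(column):
            scaled = 0.0 if hi == lo else (item - lo) / (hi - lo)
            totals[i] += scaled if weight == 1 else 1 - scaled
    return totals

if __name__ == "__main__":
    print(score_rows([[20, 60, 2012], [23, 90, 2015], [22, 50, 2011]], [0, 0, 1]))
    # [2.0, 1.0, 1.3333333333333335] -- matches the doctest above
```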
TheAlgorithms/Python | 6,246 | Get rid of the Union | ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| cclauss | "2022-07-11T10:58:09Z" | "2022-07-11T11:11:17Z" | ba129de7f32b6acd1efd8e942aca109bacd86646 | dad789d9034ea6fb183bddb1a34b6b89d379e422 | Get rid of the Union. ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| def send_file(filename: str = "mytext.txt", testing: bool = False) -> None:
import socket
port = 12312 # Reserve a port for your service.
sock = socket.socket() # Create a socket object
host = socket.gethostname() # Get local machine name
sock.bind((host, port)) # Bind to the port
sock.listen(5) # Now wait for client connection.
print("Server listening....")
while True:
conn, addr = sock.accept() # Establish connection with client.
print(f"Got connection from {addr}")
data = conn.recv(1024)
print(f"Server received: {data = }")
with open(filename, "rb") as in_file:
data = in_file.read(1024)
while data:
conn.send(data)
print(f"Sent {data!r}")
data = in_file.read(1024)
print("Done sending")
conn.close()
if testing: # Allow the test to complete
break
sock.shutdown(1)
sock.close()
if __name__ == "__main__":
send_file()
| def send_file(filename: str = "mytext.txt", testing: bool = False) -> None:
import socket
port = 12312 # Reserve a port for your service.
sock = socket.socket() # Create a socket object
host = socket.gethostname() # Get local machine name
sock.bind((host, port)) # Bind to the port
sock.listen(5) # Now wait for client connection.
print("Server listening....")
while True:
conn, addr = sock.accept() # Establish connection with client.
print(f"Got connection from {addr}")
data = conn.recv(1024)
print(f"Server received: {data = }")
with open(filename, "rb") as in_file:
data = in_file.read(1024)
while data:
conn.send(data)
print(f"Sent {data!r}")
data = in_file.read(1024)
print("Done sending")
conn.close()
if testing: # Allow the test to complete
break
sock.shutdown(1)
sock.close()
if __name__ == "__main__":
send_file()
| -1 |
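The send_file server above streams a file in 1024-byte chunks over a plain TCP socket and closes the connection when done. A hypothetical client-side counterpart is sketched below; the host, port, and output filename are assumptions chosen to mirror the listing, not part of the PR.

```python
# Hypothetical client counterpart: connect, send a greeting, then read the
# 1024-byte chunks the server streams until the connection closes.
import socket

def receive_file(host: str = "localhost", port: int = 12312,
                 out_path: str = "received.txt") -> None:
    with socket.socket() as sock:
        sock.connect((host, port))
        sock.send(b"Hello server!")
        with open(out_path, "wb") as out_file:
            while True:
                data = sock.recv(1024)
                if not data:
                    break
                out_file.write(data)

if __name__ == "__main__":
    receive_file()
```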
TheAlgorithms/Python | 6,246 | Get rid of the Union | ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| cclauss | "2022-07-11T10:58:09Z" | "2022-07-11T11:11:17Z" | ba129de7f32b6acd1efd8e942aca109bacd86646 | dad789d9034ea6fb183bddb1a34b6b89d379e422 | Get rid of the Union. ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| from __future__ import annotations
import sys
class Letter:
def __init__(self, letter: str, freq: int):
self.letter: str = letter
self.freq: int = freq
self.bitstring: dict[str, str] = {}
def __repr__(self) -> str:
return f"{self.letter}:{self.freq}"
class TreeNode:
def __init__(self, freq: int, left: Letter | TreeNode, right: Letter | TreeNode):
self.freq: int = freq
self.left: Letter | TreeNode = left
self.right: Letter | TreeNode = right
def parse_file(file_path: str) -> list[Letter]:
"""
Read the file and build a dict of all letters and their
frequencies, then convert the dict into a list of Letters.
"""
chars: dict[str, int] = {}
with open(file_path) as f:
while True:
c = f.read(1)
if not c:
break
chars[c] = chars[c] + 1 if c in chars.keys() else 1
return sorted((Letter(c, f) for c, f in chars.items()), key=lambda l: l.freq)
def build_tree(letters: list[Letter]) -> Letter | TreeNode:
"""
Run through the list of Letters and build the min heap
for the Huffman Tree.
"""
response: list[Letter | TreeNode] = letters # type: ignore
while len(response) > 1:
left = response.pop(0)
right = response.pop(0)
total_freq = left.freq + right.freq
node = TreeNode(total_freq, left, right)
response.append(node)
response.sort(key=lambda l: l.freq)
return response[0]
def traverse_tree(root: Letter | TreeNode, bitstring: str) -> list[Letter]:
"""
Recursively traverse the Huffman Tree to set each
Letter's bitstring dictionary, and return the list of Letters
"""
if type(root) is Letter:
root.bitstring[root.letter] = bitstring
return [root]
treenode: TreeNode = root # type: ignore
letters = []
letters += traverse_tree(treenode.left, bitstring + "0")
letters += traverse_tree(treenode.right, bitstring + "1")
return letters
def huffman(file_path: str) -> None:
"""
Parse the file, build the tree, then run through the file
again, using the letters dictionary to find and print out the
bitstring for each letter.
"""
letters_list = parse_file(file_path)
root = build_tree(letters_list)
letters = {
k: v for letter in traverse_tree(root, "") for k, v in letter.bitstring.items()
}
print(f"Huffman Coding of {file_path}: ")
with open(file_path) as f:
while True:
c = f.read(1)
if not c:
break
print(letters[c], end=" ")
print()
if __name__ == "__main__":
# pass the file path to the huffman function
huffman(sys.argv[1])
| from __future__ import annotations
import sys
class Letter:
def __init__(self, letter: str, freq: int):
self.letter: str = letter
self.freq: int = freq
self.bitstring: dict[str, str] = {}
def __repr__(self) -> str:
return f"{self.letter}:{self.freq}"
class TreeNode:
def __init__(self, freq: int, left: Letter | TreeNode, right: Letter | TreeNode):
self.freq: int = freq
self.left: Letter | TreeNode = left
self.right: Letter | TreeNode = right
def parse_file(file_path: str) -> list[Letter]:
"""
Read the file and build a dict of all letters and their
frequencies, then convert the dict into a list of Letters.
"""
chars: dict[str, int] = {}
with open(file_path) as f:
while True:
c = f.read(1)
if not c:
break
chars[c] = chars[c] + 1 if c in chars.keys() else 1
return sorted((Letter(c, f) for c, f in chars.items()), key=lambda l: l.freq)
def build_tree(letters: list[Letter]) -> Letter | TreeNode:
"""
Run through the list of Letters and build the min heap
for the Huffman Tree.
"""
response: list[Letter | TreeNode] = letters # type: ignore
while len(response) > 1:
left = response.pop(0)
right = response.pop(0)
total_freq = left.freq + right.freq
node = TreeNode(total_freq, left, right)
response.append(node)
response.sort(key=lambda l: l.freq)
return response[0]
def traverse_tree(root: Letter | TreeNode, bitstring: str) -> list[Letter]:
"""
Recursively traverse the Huffman Tree to set each
Letter's bitstring dictionary, and return the list of Letters
"""
if type(root) is Letter:
root.bitstring[root.letter] = bitstring
return [root]
treenode: TreeNode = root # type: ignore
letters = []
letters += traverse_tree(treenode.left, bitstring + "0")
letters += traverse_tree(treenode.right, bitstring + "1")
return letters
def huffman(file_path: str) -> None:
"""
Parse the file, build the tree, then run through the file
again, using the letters dictionary to find and print out the
bitstring for each letter.
"""
letters_list = parse_file(file_path)
root = build_tree(letters_list)
letters = {
k: v for letter in traverse_tree(root, "") for k, v in letter.bitstring.items()
}
print(f"Huffman Coding of {file_path}: ")
with open(file_path) as f:
while True:
c = f.read(1)
if not c:
break
print(letters[c], end=" ")
print()
if __name__ == "__main__":
# pass the file path to the huffman function
huffman(sys.argv[1])
| -1 |
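As a usage sketch for the Huffman file above: it normally takes its input path from `sys.argv[1]`, but the same helper functions can be driven directly. The sample text, the temporary file, and the `codes` name below are illustrative assumptions, not part of the repository code; the snippet assumes `parse_file`, `build_tree` and `traverse_tree` from the file above are in scope.

import tempfile

# Write a tiny sample text to a temporary file so parse_file() has something to read.
with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as tmp:
    tmp.write("abracadabra")
    sample_path = tmp.name

letters_list = parse_file(sample_path)  # Letter objects sorted by ascending frequency
root = build_tree(letters_list)         # root of the Huffman tree (a TreeNode here)
codes = {
    k: v for letter in traverse_tree(root, "") for k, v in letter.bitstring.items()
}
print(codes)  # more frequent characters end up with shorter bitstrings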
TheAlgorithms/Python | 6,246 | Get rid of the Union | ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| cclauss | "2022-07-11T10:58:09Z" | "2022-07-11T11:11:17Z" | ba129de7f32b6acd1efd8e942aca109bacd86646 | dad789d9034ea6fb183bddb1a34b6b89d379e422 | Get rid of the Union. ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| -1 |
||
TheAlgorithms/Python | 6,246 | Get rid of the Union | ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| cclauss | "2022-07-11T10:58:09Z" | "2022-07-11T11:11:17Z" | ba129de7f32b6acd1efd8e942aca109bacd86646 | dad789d9034ea6fb183bddb1a34b6b89d379e422 | Get rid of the Union. ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| -1 |
||
TheAlgorithms/Python | 6,246 | Get rid of the Union | ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| cclauss | "2022-07-11T10:58:09Z" | "2022-07-11T11:11:17Z" | ba129de7f32b6acd1efd8e942aca109bacd86646 | dad789d9034ea6fb183bddb1a34b6b89d379e422 | Get rid of the Union. ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| -1 |
||
TheAlgorithms/Python | 6,246 | Get rid of the Union | ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| cclauss | "2022-07-11T10:58:09Z" | "2022-07-11T11:11:17Z" | ba129de7f32b6acd1efd8e942aca109bacd86646 | dad789d9034ea6fb183bddb1a34b6b89d379e422 | Get rid of the Union. ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| """
Project Euler Problem 3: https://projecteuler.net/problem=3
Largest prime factor
The prime factors of 13195 are 5, 7, 13 and 29.
What is the largest prime factor of the number 600851475143?
References:
- https://en.wikipedia.org/wiki/Prime_number#Unique_factorization
"""
import math
def is_prime(num: int) -> bool:
"""
Returns boolean representing primality of given number num.
>>> is_prime(2)
True
>>> is_prime(3)
True
>>> is_prime(27)
False
>>> is_prime(2999)
True
>>> is_prime(0)
Traceback (most recent call last):
...
ValueError: Parameter num must be greater than or equal to two.
>>> is_prime(1)
Traceback (most recent call last):
...
ValueError: Parameter num must be greater than or equal to two.
"""
if num <= 1:
raise ValueError("Parameter num must be greater than or equal to two.")
if num == 2:
return True
elif num % 2 == 0:
return False
for i in range(3, int(math.sqrt(num)) + 1, 2):
if num % i == 0:
return False
return True
def solution(n: int = 600851475143) -> int:
"""
Returns the largest prime factor of a given number n.
>>> solution(13195)
29
>>> solution(10)
5
>>> solution(17)
17
>>> solution(3.4)
3
>>> solution(0)
Traceback (most recent call last):
...
ValueError: Parameter n must be greater than or equal to one.
>>> solution(-17)
Traceback (most recent call last):
...
ValueError: Parameter n must be greater than or equal to one.
>>> solution([])
Traceback (most recent call last):
...
TypeError: Parameter n must be int or castable to int.
>>> solution("asd")
Traceback (most recent call last):
...
TypeError: Parameter n must be int or castable to int.
"""
try:
n = int(n)
except (TypeError, ValueError):
raise TypeError("Parameter n must be int or castable to int.")
if n <= 0:
raise ValueError("Parameter n must be greater than or equal to one.")
max_number = 0
if is_prime(n):
return n
while n % 2 == 0:
n //= 2
if is_prime(n):
return n
for i in range(3, int(math.sqrt(n)) + 1, 2):
if n % i == 0:
if is_prime(n // i):
max_number = n // i
break
elif is_prime(i):
max_number = i
return max_number
if __name__ == "__main__":
print(f"{solution() = }")
| """
Project Euler Problem 3: https://projecteuler.net/problem=3
Largest prime factor
The prime factors of 13195 are 5, 7, 13 and 29.
What is the largest prime factor of the number 600851475143?
References:
- https://en.wikipedia.org/wiki/Prime_number#Unique_factorization
"""
import math
def is_prime(num: int) -> bool:
"""
Returns boolean representing primality of given number num.
>>> is_prime(2)
True
>>> is_prime(3)
True
>>> is_prime(27)
False
>>> is_prime(2999)
True
>>> is_prime(0)
Traceback (most recent call last):
...
ValueError: Parameter num must be greater than or equal to two.
>>> is_prime(1)
Traceback (most recent call last):
...
ValueError: Parameter num must be greater than or equal to two.
"""
if num <= 1:
raise ValueError("Parameter num must be greater than or equal to two.")
if num == 2:
return True
elif num % 2 == 0:
return False
for i in range(3, int(math.sqrt(num)) + 1, 2):
if num % i == 0:
return False
return True
def solution(n: int = 600851475143) -> int:
"""
Returns the largest prime factor of a given number n.
>>> solution(13195)
29
>>> solution(10)
5
>>> solution(17)
17
>>> solution(3.4)
3
>>> solution(0)
Traceback (most recent call last):
...
ValueError: Parameter n must be greater than or equal to one.
>>> solution(-17)
Traceback (most recent call last):
...
ValueError: Parameter n must be greater than or equal to one.
>>> solution([])
Traceback (most recent call last):
...
TypeError: Parameter n must be int or castable to int.
>>> solution("asd")
Traceback (most recent call last):
...
TypeError: Parameter n must be int or castable to int.
"""
try:
n = int(n)
except (TypeError, ValueError):
raise TypeError("Parameter n must be int or castable to int.")
if n <= 0:
raise ValueError("Parameter n must be greater than or equal to one.")
max_number = 0
if is_prime(n):
return n
while n % 2 == 0:
n //= 2
if is_prime(n):
return n
for i in range(3, int(math.sqrt(n)) + 1, 2):
if n % i == 0:
if is_prime(n // i):
max_number = n // i
break
elif is_prime(i):
max_number = i
return max_number
if __name__ == "__main__":
print(f"{solution() = }")
| -1 |
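A few quick checks of the Problem 3 solution above, using the value from its doctests plus the known factorisation of the default input; this assumes `solution` and `is_prime` from the file above are in scope.

# 13195 = 5 * 7 * 13 * 29, so the largest prime factor is 29 (as in the doctest).
assert solution(13195) == 29

# 600851475143 = 71 * 839 * 1471 * 6857, so the default call returns 6857.
assert solution() == 6857

# is_prime() trial-divides only odd candidates up to sqrt(num).
assert is_prime(6857) is True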
TheAlgorithms/Python | 6,246 | Get rid of the Union | ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| cclauss | "2022-07-11T10:58:09Z" | "2022-07-11T11:11:17Z" | ba129de7f32b6acd1efd8e942aca109bacd86646 | dad789d9034ea6fb183bddb1a34b6b89d379e422 | Get rid of the Union. ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| from __future__ import annotations
def dfs(u):
global graph, reversedGraph, scc, component, visit, stack
if visit[u]:
return
visit[u] = True
for v in graph[u]:
dfs(v)
stack.append(u)
def dfs2(u):
global graph, reversedGraph, scc, component, visit, stack
if visit[u]:
return
visit[u] = True
component.append(u)
for v in reversedGraph[u]:
dfs2(v)
def kosaraju():
global graph, reversedGraph, scc, component, visit, stack
for i in range(n):
dfs(i)
visit = [False] * n
for i in stack[::-1]:
if visit[i]:
continue
component = []
dfs2(i)
scc.append(component)
return scc
if __name__ == "__main__":
# n - no of nodes, m - no of edges
n, m = list(map(int, input().strip().split()))
graph: list[list[int]] = [[] for i in range(n)] # graph
reversedGraph: list[list[int]] = [[] for i in range(n)] # reversed graph
# input graph data (edges)
for i in range(m):
u, v = list(map(int, input().strip().split()))
graph[u].append(v)
reversedGraph[v].append(u)
stack: list[int] = []
visit: list[bool] = [False] * n
scc: list[int] = []
component: list[int] = []
print(kosaraju())
| from __future__ import annotations
def dfs(u):
global graph, reversedGraph, scc, component, visit, stack
if visit[u]:
return
visit[u] = True
for v in graph[u]:
dfs(v)
stack.append(u)
def dfs2(u):
global graph, reversedGraph, scc, component, visit, stack
if visit[u]:
return
visit[u] = True
component.append(u)
for v in reversedGraph[u]:
dfs2(v)
def kosaraju():
global graph, reversedGraph, scc, component, visit, stack
for i in range(n):
dfs(i)
visit = [False] * n
for i in stack[::-1]:
if visit[i]:
continue
component = []
dfs2(i)
scc.append(component)
return scc
if __name__ == "__main__":
# n - no of nodes, m - no of edges
n, m = list(map(int, input().strip().split()))
graph: list[list[int]] = [[] for i in range(n)] # graph
reversedGraph: list[list[int]] = [[] for i in range(n)] # reversed graph
# input graph data (edges)
for i in range(m):
u, v = list(map(int, input().strip().split()))
graph[u].append(v)
reversedGraph[v].append(u)
stack: list[int] = []
visit: list[bool] = [False] * n
scc: list[int] = []
component: list[int] = []
print(kosaraju())
| -1 |
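The Kosaraju script above reads its graph from standard input: the first line holds the node and edge counts `n m`, followed by one directed edge `u v` per line, and it prints the strongly connected components it finds. The four-node graph below is an illustrative assumption (a 0→1→2→0 cycle plus a dangling edge 2→3); with this implementation the run should print the components in the order shown.

Example input (stdin):
    4 4
    0 1
    1 2
    2 0
    2 3

Expected output:
    [[0, 2, 1], [3]]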
TheAlgorithms/Python | 6,246 | Get rid of the Union | ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| cclauss | "2022-07-11T10:58:09Z" | "2022-07-11T11:11:17Z" | ba129de7f32b6acd1efd8e942aca109bacd86646 | dad789d9034ea6fb183bddb1a34b6b89d379e422 | Get rid of the Union. ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| """ https://en.wikipedia.org/wiki/Rail_fence_cipher """
def encrypt(input_string: str, key: int) -> str:
"""
Shuffles the character of a string by placing each of them
in a grid (the height is dependent on the key) in a zigzag
formation and reading it left to right.
>>> encrypt("Hello World", 4)
'HWe olordll'
>>> encrypt("This is a message", 0)
Traceback (most recent call last):
...
ValueError: Height of grid can't be 0 or negative
>>> encrypt(b"This is a byte string", 5)
Traceback (most recent call last):
...
TypeError: sequence item 0: expected str instance, int found
"""
temp_grid: list[list[str]] = [[] for _ in range(key)]
lowest = key - 1
if key <= 0:
raise ValueError("Height of grid can't be 0 or negative")
if key == 1 or len(input_string) <= key:
return input_string
for position, character in enumerate(input_string):
num = position % (lowest * 2) # puts it in bounds
num = min(num, lowest * 2 - num) # creates zigzag pattern
temp_grid[num].append(character)
grid = ["".join(row) for row in temp_grid]
output_string = "".join(grid)
return output_string
def decrypt(input_string: str, key: int) -> str:
"""
Generates a template based on the key and fills it in with
the characters of the input string and then reading it in
a zigzag formation.
>>> decrypt("HWe olordll", 4)
'Hello World'
>>> decrypt("This is a message", -10)
Traceback (most recent call last):
...
ValueError: Height of grid can't be 0 or negative
>>> decrypt("My key is very big", 100)
'My key is very big'
"""
grid = []
lowest = key - 1
if key <= 0:
raise ValueError("Height of grid can't be 0 or negative")
if key == 1:
return input_string
temp_grid: list[list[str]] = [[] for _ in range(key)] # generates template
for position in range(len(input_string)):
num = position % (lowest * 2) # puts it in bounds
num = min(num, lowest * 2 - num) # creates zigzag pattern
temp_grid[num].append("*")
counter = 0
for row in temp_grid: # fills in the characters
splice = input_string[counter : counter + len(row)]
grid.append([character for character in splice])
counter += len(row)
output_string = "" # reads as zigzag
for position in range(len(input_string)):
num = position % (lowest * 2) # puts it in bounds
num = min(num, lowest * 2 - num) # creates zigzag pattern
output_string += grid[num][0]
grid[num].pop(0)
return output_string
def bruteforce(input_string: str) -> dict[int, str]:
"""Uses decrypt function by guessing every key
>>> bruteforce("HWe olordll")[4]
'Hello World'
"""
results = {}
for key_guess in range(1, len(input_string)): # tries every key
results[key_guess] = decrypt(input_string, key_guess)
return results
if __name__ == "__main__":
import doctest
doctest.testmod()
| """ https://en.wikipedia.org/wiki/Rail_fence_cipher """
def encrypt(input_string: str, key: int) -> str:
"""
Shuffles the character of a string by placing each of them
in a grid (the height is dependent on the key) in a zigzag
formation and reading it left to right.
>>> encrypt("Hello World", 4)
'HWe olordll'
>>> encrypt("This is a message", 0)
Traceback (most recent call last):
...
ValueError: Height of grid can't be 0 or negative
>>> encrypt(b"This is a byte string", 5)
Traceback (most recent call last):
...
TypeError: sequence item 0: expected str instance, int found
"""
temp_grid: list[list[str]] = [[] for _ in range(key)]
lowest = key - 1
if key <= 0:
raise ValueError("Height of grid can't be 0 or negative")
if key == 1 or len(input_string) <= key:
return input_string
for position, character in enumerate(input_string):
num = position % (lowest * 2) # puts it in bounds
num = min(num, lowest * 2 - num) # creates zigzag pattern
temp_grid[num].append(character)
grid = ["".join(row) for row in temp_grid]
output_string = "".join(grid)
return output_string
def decrypt(input_string: str, key: int) -> str:
"""
Generates a template based on the key and fills it in with
the characters of the input string and then reading it in
a zigzag formation.
>>> decrypt("HWe olordll", 4)
'Hello World'
>>> decrypt("This is a message", -10)
Traceback (most recent call last):
...
ValueError: Height of grid can't be 0 or negative
>>> decrypt("My key is very big", 100)
'My key is very big'
"""
grid = []
lowest = key - 1
if key <= 0:
raise ValueError("Height of grid can't be 0 or negative")
if key == 1:
return input_string
temp_grid: list[list[str]] = [[] for _ in range(key)] # generates template
for position in range(len(input_string)):
num = position % (lowest * 2) # puts it in bounds
num = min(num, lowest * 2 - num) # creates zigzag pattern
temp_grid[num].append("*")
counter = 0
for row in temp_grid: # fills in the characters
splice = input_string[counter : counter + len(row)]
grid.append([character for character in splice])
counter += len(row)
output_string = "" # reads as zigzag
for position in range(len(input_string)):
num = position % (lowest * 2) # puts it in bounds
num = min(num, lowest * 2 - num) # creates zigzag pattern
output_string += grid[num][0]
grid[num].pop(0)
return output_string
def bruteforce(input_string: str) -> dict[int, str]:
"""Uses decrypt function by guessing every key
>>> bruteforce("HWe olordll")[4]
'Hello World'
"""
results = {}
for key_guess in range(1, len(input_string)): # tries every key
results[key_guess] = decrypt(input_string, key_guess)
return results
if __name__ == "__main__":
import doctest
doctest.testmod()
| -1 |
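As a quick round-trip check of the rail fence functions above (the message and key below are arbitrary illustrative choices; the snippet assumes `encrypt`, `decrypt` and `bruteforce` from the file above are in scope):

message = "WEAREDISCOVEREDFLEEATONCE"  # example plaintext, spaces removed
ciphertext = encrypt(message, 3)       # zigzag over 3 rails, read row by row
assert decrypt(ciphertext, 3) == message     # decrypt() inverts encrypt() for the same key
assert bruteforce(ciphertext)[3] == message  # brute force recovers it at key guess 3
print(ciphertext)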
TheAlgorithms/Python | 6,246 | Get rid of the Union | ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| cclauss | "2022-07-11T10:58:09Z" | "2022-07-11T11:11:17Z" | ba129de7f32b6acd1efd8e942aca109bacd86646 | dad789d9034ea6fb183bddb1a34b6b89d379e422 | Get rid of the Union. ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| -1 |
||
TheAlgorithms/Python | 6,246 | Get rid of the Union | ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| cclauss | "2022-07-11T10:58:09Z" | "2022-07-11T11:11:17Z" | ba129de7f32b6acd1efd8e942aca109bacd86646 | dad789d9034ea6fb183bddb1a34b6b89d379e422 | Get rid of the Union. ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| 04/24/2020, 1279.31, 1640394, 1261.17, 1280.4, 1249.45
04/23/2020, 1276.31, 1566203, 1271.55, 1293.31, 1265.67
04/22/2020, 1263.21, 2093140, 1245.54, 1285.6133, 1242
04/21/2020, 1216.34, 2153003, 1247, 1254.27, 1209.71
04/20/2020, 1266.61, 1695488, 1271, 1281.6, 1261.37
04/17/2020, 1283.25, 1949042, 1284.85, 1294.43, 1271.23
04/16/2020, 1263.47, 2518099, 1274.1, 1279, 1242.62
04/15/2020, 1262.47, 1671703, 1245.61, 1280.46, 1240.4
04/14/2020, 1269.23, 2470353, 1245.09, 1282.07, 1236.93
04/13/2020, 1217.56, 1739828, 1209.18, 1220.51, 1187.5984
04/09/2020, 1211.45, 2175421, 1224.08, 1225.57, 1196.7351
04/08/2020, 1210.28, 1975135, 1206.5, 1219.07, 1188.16
04/07/2020, 1186.51, 2387329, 1221, 1225, 1182.23
04/06/2020, 1186.92, 2664723, 1138, 1194.66, 1130.94
04/03/2020, 1097.88, 2313400, 1119.015, 1123.54, 1079.81
04/02/2020, 1120.84, 1964881, 1098.26, 1126.86, 1096.4
04/01/2020, 1105.62, 2344173, 1122, 1129.69, 1097.45
03/31/2020, 1162.81, 2487983, 1147.3, 1175.31, 1138.14
03/30/2020, 1146.82, 2574061, 1125.04, 1151.63, 1096.48
03/27/2020, 1110.71, 3208495, 1125.67, 1150.6702, 1105.91
03/26/2020, 1161.75, 3573755, 1111.8, 1169.97, 1093.53
03/25/2020, 1102.49, 4081528, 1126.47, 1148.9, 1086.01
03/24/2020, 1134.46, 3344450, 1103.77, 1135, 1090.62
03/23/2020, 1056.62, 4044137, 1061.32, 1071.32, 1013.5361
03/20/2020, 1072.32, 3601750, 1135.72, 1143.99, 1065.49
03/19/2020, 1115.29, 3651106, 1093.05, 1157.9699, 1060.1075
03/18/2020, 1096.8, 4233435, 1056.51, 1106.5, 1037.28
03/17/2020, 1119.8, 3861489, 1093.11, 1130.86, 1056.01
03/16/2020, 1084.33, 4252365, 1096, 1152.2665, 1074.44
03/13/2020, 1219.73, 3700125, 1179, 1219.76, 1117.1432
03/12/2020, 1114.91, 4226748, 1126, 1193.87, 1113.3
03/11/2020, 1215.41, 2611229, 1249.7, 1260.96, 1196.07
03/10/2020, 1280.39, 2611373, 1260, 1281.15, 1218.77
03/09/2020, 1215.56, 3365365, 1205.3, 1254.7599, 1200
03/06/2020, 1298.41, 2660628, 1277.06, 1306.22, 1261.05
03/05/2020, 1319.04, 2561288, 1350.2, 1358.91, 1305.1
03/04/2020, 1386.52, 1913315, 1359.23, 1388.09, 1343.11
03/03/2020, 1341.39, 2402326, 1399.42, 1410.15, 1332
03/02/2020, 1389.11, 2431468, 1351.61, 1390.87, 1326.815
02/28/2020, 1339.33, 3790618, 1277.5, 1341.14, 1271
02/27/2020, 1318.09, 2978300, 1362.06, 1371.7037, 1317.17
02/26/2020, 1393.18, 2204037, 1396.14, 1415.7, 1379
02/25/2020, 1388.45, 2478278, 1433, 1438.14, 1382.4
02/24/2020, 1421.59, 2867053, 1426.11, 1436.97, 1411.39
02/21/2020, 1485.11, 1732273, 1508.03, 1512.215, 1480.44
02/20/2020, 1518.15, 1096552, 1522, 1529.64, 1506.82
02/19/2020, 1526.69, 949268, 1525.07, 1532.1063, 1521.4
02/18/2020, 1519.67, 1121140, 1515, 1531.63, 1512.59
02/14/2020, 1520.74, 1197836, 1515.6, 1520.74, 1507.34
02/13/2020, 1514.66, 929730, 1512.69, 1527.18, 1504.6
02/12/2020, 1518.27, 1167565, 1514.48, 1520.695, 1508.11
02/11/2020, 1508.79, 1344633, 1511.81, 1529.63, 1505.6378
02/10/2020, 1508.68, 1419876, 1474.32, 1509.5, 1474.32
02/07/2020, 1479.23, 1172270, 1467.3, 1485.84, 1466.35
02/06/2020, 1476.23, 1679384, 1450.33, 1481.9997, 1449.57
02/05/2020, 1448.23, 1986157, 1462.42, 1463.84, 1430.56
02/04/2020, 1447.07, 3932954, 1457.07, 1469.5, 1426.3
02/03/2020, 1485.94, 3055216, 1462, 1490, 1458.99
01/31/2020, 1434.23, 2417214, 1468.9, 1470.13, 1428.53
01/30/2020, 1455.84, 1339421, 1439.96, 1457.28, 1436.4
01/29/2020, 1458.63, 1078667, 1458.8, 1465.43, 1446.74
01/28/2020, 1452.56, 1577422, 1443, 1456, 1432.47
01/27/2020, 1433.9, 1755201, 1431, 1438.07, 1421.2
01/24/2020, 1466.71, 1784644, 1493.59, 1495.495, 1465.25
01/23/2020, 1486.65, 1351354, 1487.64, 1495.52, 1482.1
01/22/2020, 1485.95, 1610846, 1491, 1503.2143, 1484.93
01/21/2020, 1484.4, 2036780, 1479.12, 1491.85, 1471.2
01/17/2020, 1480.39, 2396215, 1462.91, 1481.2954, 1458.22
01/16/2020, 1451.7, 1173688, 1447.44, 1451.99, 1440.92
01/15/2020, 1439.2, 1282685, 1430.21, 1441.395, 1430.21
01/14/2020, 1430.88, 1560453, 1439.01, 1441.8, 1428.37
01/13/2020, 1439.23, 1653482, 1436.13, 1440.52, 1426.02
01/10/2020, 1429.73, 1821566, 1427.56, 1434.9292, 1418.35
01/09/2020, 1419.83, 1502664, 1420.57, 1427.33, 1410.27
01/08/2020, 1404.32, 1529177, 1392.08, 1411.58, 1390.84
01/07/2020, 1393.34, 1511693, 1397.94, 1402.99, 1390.38
01/06/2020, 1394.21, 1733149, 1350, 1396.5, 1350
01/03/2020, 1360.66, 1187006, 1347.86, 1372.5, 1345.5436
01/02/2020, 1367.37, 1406731, 1341.55, 1368.14, 1341.55
12/31/2019, 1337.02, 962468, 1330.11, 1338, 1329.085
12/30/2019, 1336.14, 1051323, 1350, 1353, 1334.02
12/27/2019, 1351.89, 1038718, 1362.99, 1364.53, 1349.31
12/26/2019, 1360.4, 667754, 1346.17, 1361.3269, 1344.47
12/24/2019, 1343.56, 347518, 1348.5, 1350.26, 1342.78
12/23/2019, 1348.84, 883200, 1355.87, 1359.7999, 1346.51
12/20/2019, 1349.59, 3316905, 1363.35, 1363.64, 1349
12/19/2019, 1356.04, 1470112, 1351.82, 1358.1, 1348.985
12/18/2019, 1352.62, 1657069, 1356.6, 1360.47, 1351
12/17/2019, 1355.12, 1855259, 1362.89, 1365, 1351.3231
12/16/2019, 1361.17, 1397451, 1356.5, 1364.68, 1352.67
12/13/2019, 1347.83, 1550028, 1347.95, 1353.0931, 1343.87
12/12/2019, 1350.27, 1281722, 1345.94, 1355.775, 1340.5
12/11/2019, 1345.02, 850796, 1350.84, 1351.2, 1342.67
12/10/2019, 1344.66, 1094653, 1341.5, 1349.975, 1336.04
12/09/2019, 1343.56, 1355795, 1338.04, 1359.45, 1337.84
12/06/2019, 1340.62, 1315510, 1333.44, 1344, 1333.44
12/05/2019, 1328.13, 1212818, 1328, 1329.3579, 1316.44
12/04/2019, 1320.54, 1538110, 1307.01, 1325.8, 1304.87
12/03/2019, 1295.28, 1268647, 1279.57, 1298.461, 1279
12/02/2019, 1289.92, 1511851, 1301, 1305.83, 1281
11/29/2019, 1304.96, 586981, 1307.12, 1310.205, 1303.97
11/27/2019, 1312.99, 996329, 1315, 1318.36, 1309.63
11/26/2019, 1313.55, 1069795, 1309.86, 1314.8, 1305.09
11/25/2019, 1306.69, 1036487, 1299.18, 1311.31, 1298.13
11/22/2019, 1295.34, 1386506, 1305.62, 1308.73, 1291.41
11/21/2019, 1301.35, 995499, 1301.48, 1312.59, 1293
11/20/2019, 1303.05, 1309835, 1311.74, 1315, 1291.15
11/19/2019, 1315.46, 1269372, 1327.7, 1327.7, 1312.8
11/18/2019, 1320.7, 1488083, 1332.22, 1335.5288, 1317.5
11/15/2019, 1334.87, 1782955, 1318.94, 1334.88, 1314.2796
11/14/2019, 1311.46, 1194305, 1297.5, 1317, 1295.65
11/13/2019, 1298, 853861, 1294.07, 1304.3, 1293.51
11/12/2019, 1298.8, 1085859, 1300, 1310, 1295.77
11/11/2019, 1299.19, 1012429, 1303.18, 1306.425, 1297.41
11/08/2019, 1311.37, 1251916, 1305.28, 1318, 1304.365
11/07/2019, 1308.86, 2029970, 1294.28, 1323.74, 1294.245
11/06/2019, 1291.8, 1152977, 1289.46, 1293.73, 1282.5
11/05/2019, 1292.03, 1282711, 1292.89, 1298.93, 1291.2289
11/04/2019, 1291.37, 1500964, 1276.45, 1294.13, 1276.355
11/01/2019, 1273.74, 1670072, 1265, 1274.62, 1260.5
10/31/2019, 1260.11, 1455651, 1261.28, 1267.67, 1250.8428
10/30/2019, 1261.29, 1408851, 1252.97, 1269.36, 1252
10/29/2019, 1262.62, 1886380, 1276.23, 1281.59, 1257.2119
10/28/2019, 1290, 2613237, 1275.45, 1299.31, 1272.54
10/25/2019, 1265.13, 1213051, 1251.03, 1269.6, 1250.01
10/24/2019, 1260.99, 1039868, 1260.9, 1264, 1253.715
10/23/2019, 1259.13, 928595, 1242.36, 1259.89, 1242.36
10/22/2019, 1242.8, 1047851, 1247.85, 1250.6, 1241.38
10/21/2019, 1246.15, 1038042, 1252.26, 1254.6287, 1240.6
10/18/2019, 1245.49, 1352839, 1253.46, 1258.89, 1241.08
10/17/2019, 1253.07, 980510, 1250.93, 1263.325, 1249.94
10/16/2019, 1243.64, 1168174, 1241.17, 1254.74, 1238.45
10/15/2019, 1243.01, 1395259, 1220.4, 1247.33, 1220.4
10/14/2019, 1217.14, 882039, 1212.34, 1226.33, 1211.76
10/11/2019, 1215.45, 1277144, 1222.21, 1228.39, 1213.74
10/10/2019, 1208.67, 932531, 1198.58, 1215, 1197.34
10/09/2019, 1202.31, 876632, 1199.35, 1208.35, 1197.63
10/08/2019, 1189.13, 1141784, 1197.59, 1206.08, 1189.01
10/07/2019, 1207.68, 867149, 1204.4, 1218.2036, 1203.75
10/04/2019, 1209, 1183264, 1191.89, 1211.44, 1189.17
10/03/2019, 1187.83, 1663656, 1180, 1189.06, 1162.43
10/02/2019, 1176.63, 1639237, 1196.98, 1196.99, 1171.29
10/01/2019, 1205.1, 1358279, 1219, 1231.23, 1203.58
09/30/2019, 1219, 1419676, 1220.97, 1226, 1212.3
09/27/2019, 1225.09, 1354432, 1243.01, 1244.02, 1214.45
09/26/2019, 1241.39, 1561882, 1241.96, 1245, 1232.268
09/25/2019, 1246.52, 1593875, 1215.82, 1248.3, 1210.09
09/24/2019, 1218.76, 1591786, 1240, 1246.74, 1210.68
09/23/2019, 1234.03, 1075253, 1226, 1239.09, 1224.17
09/20/2019, 1229.93, 2337269, 1233.12, 1243.32, 1223.08
09/19/2019, 1238.71, 1000155, 1232.06, 1244.44, 1232.02
09/18/2019, 1232.41, 1144333, 1227.51, 1235.61, 1216.53
09/17/2019, 1229.15, 958112, 1230.4, 1235, 1223.69
09/16/2019, 1231.3, 1053299, 1229.52, 1239.56, 1225.61
09/13/2019, 1239.56, 1301350, 1231.35, 1240.88, 1227.01
09/12/2019, 1234.25, 1725908, 1224.3, 1241.86, 1223.02
09/11/2019, 1220.17, 1307033, 1203.41, 1222.6, 1202.2
09/10/2019, 1206, 1260115, 1195.15, 1210, 1194.58
09/09/2019, 1204.41, 1471880, 1204, 1220, 1192.62
09/06/2019, 1204.93, 1072143, 1208.13, 1212.015, 1202.5222
09/05/2019, 1211.38, 1408601, 1191.53, 1213.04, 1191.53
09/04/2019, 1181.41, 1068968, 1176.71, 1183.48, 1171
09/03/2019, 1168.39, 1480420, 1177.03, 1186.89, 1163.2
08/30/2019, 1188.1, 1129959, 1198.5, 1198.5, 1183.8026
08/29/2019, 1192.85, 1088858, 1181.12, 1196.06, 1181.12
08/28/2019, 1171.02, 802243, 1161.71, 1176.4199, 1157.3
08/27/2019, 1167.84, 1077452, 1180.53, 1182.4, 1161.45
08/26/2019, 1168.89, 1226441, 1157.26, 1169.47, 1152.96
08/23/2019, 1151.29, 1688271, 1181.99, 1194.08, 1147.75
08/22/2019, 1189.53, 947906, 1194.07, 1198.0115, 1178.58
08/21/2019, 1191.25, 741053, 1193.15, 1199, 1187.43
08/20/2019, 1182.69, 915605, 1195.25, 1196.06, 1182.11
08/19/2019, 1198.45, 1232517, 1190.09, 1206.99, 1190.09
08/16/2019, 1177.6, 1349436, 1179.55, 1182.72, 1171.81
08/15/2019, 1167.26, 1224739, 1163.5, 1175.84, 1162.11
08/14/2019, 1164.29, 1578668, 1176.31, 1182.3, 1160.54
08/13/2019, 1197.27, 1318009, 1171.46, 1204.78, 1171.46
08/12/2019, 1174.71, 1003187, 1179.21, 1184.96, 1167.6723
08/09/2019, 1188.01, 1065658, 1197.99, 1203.88, 1183.603
08/08/2019, 1204.8, 1467997, 1182.83, 1205.01, 1173.02
08/07/2019, 1173.99, 1444324, 1156, 1178.4451, 1149.6239
08/06/2019, 1169.95, 1709374, 1163.31, 1179.96, 1160
08/05/2019, 1152.32, 2597455, 1170.04, 1175.24, 1140.14
08/02/2019, 1193.99, 1645067, 1200.74, 1206.9, 1188.94
08/01/2019, 1209.01, 1698510, 1214.03, 1234.11, 1205.72
07/31/2019, 1216.68, 1725454, 1223, 1234, 1207.7635
07/30/2019, 1225.14, 1453263, 1225.41, 1234.87, 1223.3
07/29/2019, 1239.41, 2223731, 1241.05, 1247.37, 1228.23
07/26/2019, 1250.41, 4805752, 1224.04, 1265.5499, 1224
07/25/2019, 1132.12, 2209823, 1137.82, 1141.7, 1120.92
07/24/2019, 1137.81, 1590101, 1131.9, 1144, 1126.99
07/23/2019, 1146.21, 1093688, 1144, 1146.9, 1131.8
07/22/2019, 1138.07, 1301846, 1133.45, 1139.25, 1124.24
07/19/2019, 1130.1, 1647245, 1148.19, 1151.14, 1129.62
07/18/2019, 1146.33, 1291281, 1141.74, 1147.605, 1132.73
07/17/2019, 1146.35, 1170047, 1150.97, 1158.36, 1145.77
07/16/2019, 1153.58, 1238807, 1146, 1158.58, 1145
07/15/2019, 1150.34, 903780, 1146.86, 1150.82, 1139.4
07/12/2019, 1144.9, 863973, 1143.99, 1147.34, 1138.78
07/11/2019, 1144.21, 1195569, 1143.25, 1153.07, 1139.58
07/10/2019, 1140.48, 1209466, 1131.22, 1142.05, 1130.97
07/09/2019, 1124.83, 1330370, 1111.8, 1128.025, 1107.17
07/08/2019, 1116.35, 1236419, 1125.17, 1125.98, 1111.21
07/05/2019, 1131.59, 1264540, 1117.8, 1132.88, 1116.14
07/03/2019, 1121.58, 767011, 1117.41, 1126.76, 1113.86
07/02/2019, 1111.25, 991755, 1102.24, 1111.77, 1098.17
07/01/2019, 1097.95, 1438504, 1098, 1107.58, 1093.703
06/28/2019, 1080.91, 1693450, 1076.39, 1081, 1073.37
06/27/2019, 1076.01, 1004477, 1084, 1087.1, 1075.29
06/26/2019, 1079.8, 1810869, 1086.5, 1092.97, 1072.24
06/25/2019, 1086.35, 1546913, 1112.66, 1114.35, 1083.8
06/24/2019, 1115.52, 1395696, 1119.61, 1122, 1111.01
06/21/2019, 1121.88, 1947591, 1109.24, 1124.11, 1108.08
06/20/2019, 1111.42, 1262011, 1119.99, 1120.12, 1104.74
06/19/2019, 1102.33, 1339218, 1105.6, 1107, 1093.48
06/18/2019, 1103.6, 1386684, 1109.69, 1116.39, 1098.99
06/17/2019, 1092.5, 941602, 1086.28, 1099.18, 1086.28
06/14/2019, 1085.35, 1111643, 1086.42, 1092.69, 1080.1721
06/13/2019, 1088.77, 1058000, 1083.64, 1094.17, 1080.15
06/12/2019, 1077.03, 1061255, 1078, 1080.93, 1067.54
06/11/2019, 1078.72, 1437063, 1093.98, 1101.99, 1077.6025
06/10/2019, 1080.38, 1464248, 1072.98, 1092.66, 1072.3216
06/07/2019, 1066.04, 1802370, 1050.63, 1070.92, 1048.4
06/06/2019, 1044.34, 1703244, 1044.99, 1047.49, 1033.7
06/05/2019, 1042.22, 2168439, 1051.54, 1053.55, 1030.49
06/04/2019, 1053.05, 2833483, 1042.9, 1056.05, 1033.69
06/03/2019, 1036.23, 5130576, 1065.5, 1065.5, 1025
05/31/2019, 1103.63, 1508203, 1101.29, 1109.6, 1100.18
05/30/2019, 1117.95, 951873, 1115.54, 1123.13, 1112.12
05/29/2019, 1116.46, 1538212, 1127.52, 1129.1, 1108.2201
05/28/2019, 1134.15, 1365166, 1134, 1151.5871, 1133.12
05/24/2019, 1133.47, 1112341, 1147.36, 1149.765, 1131.66
05/23/2019, 1140.77, 1199300, 1140.5, 1145.9725, 1129.224
05/22/2019, 1151.42, 914839, 1146.75, 1158.52, 1145.89
05/21/2019, 1149.63, 1160158, 1148.49, 1152.7077, 1137.94
05/20/2019, 1138.85, 1353292, 1144.5, 1146.7967, 1131.4425
05/17/2019, 1162.3, 1208623, 1168.47, 1180.15, 1160.01
05/16/2019, 1178.98, 1531404, 1164.51, 1188.16, 1162.84
05/15/2019, 1164.21, 2289302, 1117.87, 1171.33, 1116.6657
05/14/2019, 1120.44, 1836604, 1137.21, 1140.42, 1119.55
05/13/2019, 1132.03, 1860648, 1141.96, 1147.94, 1122.11
05/10/2019, 1164.27, 1314546, 1163.59, 1172.6, 1142.5
05/09/2019, 1162.38, 1185973, 1159.03, 1169.66, 1150.85
05/08/2019, 1166.27, 1309514, 1172.01, 1180.4243, 1165.74
05/07/2019, 1174.1, 1551368, 1180.47, 1190.44, 1161.04
05/06/2019, 1189.39, 1563943, 1166.26, 1190.85, 1166.26
05/03/2019, 1185.4, 1980653, 1173.65, 1186.8, 1169
05/02/2019, 1162.61, 1944817, 1167.76, 1174.1895, 1155.0018
05/01/2019, 1168.08, 2642983, 1188.05, 1188.05, 1167.18
04/30/2019, 1188.48, 6194691, 1185, 1192.81, 1175
04/29/2019, 1287.58, 2412788, 1274, 1289.27, 1266.2949
04/26/2019, 1272.18, 1228276, 1269, 1273.07, 1260.32
04/25/2019, 1263.45, 1099614, 1264.77, 1267.4083, 1252.03
04/24/2019, 1256, 1015006, 1264.12, 1268.01, 1255
04/23/2019, 1264.55, 1271195, 1250.69, 1269, 1246.38
04/22/2019, 1248.84, 806577, 1235.99, 1249.09, 1228.31
04/18/2019, 1236.37, 1315676, 1239.18, 1242, 1234.61
04/17/2019, 1236.34, 1211866, 1233, 1240.56, 1227.82
04/16/2019, 1227.13, 855258, 1225, 1230.82, 1220.12
04/15/2019, 1221.1, 1187353, 1218, 1224.2, 1209.1101
04/12/2019, 1217.87, 926799, 1210, 1218.35, 1208.11
04/11/2019, 1204.62, 709417, 1203.96, 1207.96, 1200.13
04/10/2019, 1202.16, 724524, 1200.68, 1203.785, 1196.435
04/09/2019, 1197.25, 865416, 1196, 1202.29, 1193.08
04/08/2019, 1203.84, 859969, 1207.89, 1208.69, 1199.86
04/05/2019, 1207.15, 900950, 1214.99, 1216.22, 1205.03
04/04/2019, 1215, 949962, 1205.94, 1215.67, 1204.13
04/03/2019, 1205.92, 1014195, 1207.48, 1216.3, 1200.5
04/02/2019, 1200.49, 800820, 1195.32, 1201.35, 1185.71
04/01/2019, 1194.43, 1188235, 1184.1, 1196.66, 1182
03/29/2019, 1173.31, 1269573, 1174.9, 1178.99, 1162.88
03/28/2019, 1168.49, 966843, 1171.54, 1171.565, 1159.4312
03/27/2019, 1173.02, 1362217, 1185.5, 1187.559, 1159.37
03/26/2019, 1184.62, 1894639, 1198.53, 1202.83, 1176.72
03/25/2019, 1193, 1493841, 1196.93, 1206.3975, 1187.04
03/22/2019, 1205.5, 1668910, 1226.32, 1230, 1202.825
03/21/2019, 1231.54, 1195899, 1216, 1231.79, 1213.15
03/20/2019, 1223.97, 2089367, 1197.35, 1227.14, 1196.17
03/19/2019, 1198.85, 1404863, 1188.81, 1200, 1185.87
03/18/2019, 1184.26, 1212506, 1183.3, 1190, 1177.4211
03/15/2019, 1184.46, 2457597, 1193.38, 1196.57, 1182.61
03/14/2019, 1185.55, 1150950, 1194.51, 1197.88, 1184.48
03/13/2019, 1193.32, 1434816, 1200.645, 1200.93, 1191.94
03/12/2019, 1193.2, 2012306, 1178.26, 1200, 1178.26
03/11/2019, 1175.76, 1569332, 1144.45, 1176.19, 1144.45
03/08/2019, 1142.32, 1212271, 1126.73, 1147.08, 1123.3
03/07/2019, 1143.3, 1166076, 1155.72, 1156.755, 1134.91
03/06/2019, 1157.86, 1094100, 1162.49, 1167.5658, 1155.49
03/05/2019, 1162.03, 1422357, 1150.06, 1169.61, 1146.195
03/04/2019, 1147.8, 1444774, 1146.99, 1158.2804, 1130.69
03/01/2019, 1140.99, 1447454, 1124.9, 1142.97, 1124.75
02/28/2019, 1119.92, 1541068, 1111.3, 1127.65, 1111.01
02/27/2019, 1116.05, 968362, 1106.95, 1117.98, 1101
02/26/2019, 1115.13, 1469761, 1105.75, 1119.51, 1099.92
02/25/2019, 1109.4, 1395281, 1116, 1118.54, 1107.27
02/22/2019, 1110.37, 1048361, 1100.9, 1111.24, 1095.6
02/21/2019, 1096.97, 1414744, 1110.84, 1111.94, 1092.52
02/20/2019, 1113.8, 1080144, 1119.99, 1123.41, 1105.28
02/19/2019, 1118.56, 1046315, 1110, 1121.89, 1110
02/15/2019, 1113.65, 1442461, 1130.08, 1131.67, 1110.65
02/14/2019, 1121.67, 941678, 1118.05, 1128.23, 1110.445
02/13/2019, 1120.16, 1048630, 1124.99, 1134.73, 1118.5
02/12/2019, 1121.37, 1608658, 1106.8, 1125.295, 1105.85
02/11/2019, 1095.01, 1063825, 1096.95, 1105.945, 1092.86
02/08/2019, 1095.06, 1072031, 1087, 1098.91, 1086.55
02/07/2019, 1098.71, 2040615, 1104.16, 1104.84, 1086
02/06/2019, 1115.23, 2101674, 1139.57, 1147, 1112.77
02/05/2019, 1145.99, 3529974, 1124.84, 1146.85, 1117.248
02/04/2019, 1132.8, 2518184, 1112.66, 1132.8, 1109.02
02/01/2019, 1110.75, 1455609, 1112.4, 1125, 1104.89
01/31/2019, 1116.37, 1531463, 1103, 1117.33, 1095.41
01/30/2019, 1089.06, 1241760, 1068.43, 1091, 1066.85
01/29/2019, 1060.62, 1006731, 1072.68, 1075.15, 1055.8647
01/28/2019, 1070.08, 1277745, 1080.11, 1083, 1063.8
01/25/2019, 1090.99, 1114785, 1085, 1094, 1081.82
01/24/2019, 1073.9, 1317718, 1076.48, 1079.475, 1060.7
01/23/2019, 1075.57, 956526, 1077.35, 1084.93, 1059.75
01/22/2019, 1070.52, 1607398, 1088, 1091.51, 1063.47
01/18/2019, 1098.26, 1933754, 1100, 1108.352, 1090.9
01/17/2019, 1089.9, 1223674, 1079.47, 1091.8, 1073.5
01/16/2019, 1080.97, 1320530, 1080, 1092.375, 1079.34
01/15/2019, 1077.15, 1452238, 1050.17, 1080.05, 1047.34
01/14/2019, 1044.69, 1127417, 1046.92, 1051.53, 1041.255
01/11/2019, 1057.19, 1512651, 1063.18, 1063.775, 1048.48
01/10/2019, 1070.33, 1444976, 1067.66, 1071.15, 1057.71
01/09/2019, 1074.66, 1198369, 1081.65, 1082.63, 1066.4
01/08/2019, 1076.28, 1748371, 1076.11, 1084.56, 1060.53
01/07/2019, 1068.39, 1978077, 1071.5, 1073.9999, 1054.76
01/04/2019, 1070.71, 2080144, 1032.59, 1070.84, 1027.4179
01/03/2019, 1016.06, 1829379, 1041, 1056.98, 1014.07
01/02/2019, 1045.85, 1516681, 1016.57, 1052.32, 1015.71
12/31/2018, 1035.61, 1492541, 1050.96, 1052.7, 1023.59
12/28/2018, 1037.08, 1399218, 1049.62, 1055.56, 1033.1
12/27/2018, 1043.88, 2102069, 1017.15, 1043.89, 997
12/26/2018, 1039.46, 2337212, 989.01, 1040, 983
12/24/2018, 976.22, 1590328, 973.9, 1003.54, 970.11
12/21/2018, 979.54, 4560424, 1015.3, 1024.02, 973.69
12/20/2018, 1009.41, 2659047, 1018.13, 1034.22, 996.36
12/19/2018, 1023.01, 2419322, 1033.99, 1062, 1008.05
12/18/2018, 1028.71, 2101854, 1026.09, 1049.48, 1021.44
12/17/2018, 1016.53, 2337631, 1037.51, 1053.15, 1007.9
12/14/2018, 1042.1, 1685802, 1049.98, 1062.6, 1040.79
12/13/2018, 1061.9, 1329198, 1068.07, 1079.7597, 1053.93
12/12/2018, 1063.68, 1523276, 1068, 1081.65, 1062.79
12/11/2018, 1051.75, 1354751, 1056.49, 1060.6, 1039.84
12/10/2018, 1039.55, 1793465, 1035.05, 1048.45, 1023.29
12/07/2018, 1036.58, 2098526, 1060.01, 1075.26, 1028.5
12/06/2018, 1068.73, 2758098, 1034.26, 1071.2, 1030.7701
12/04/2018, 1050.82, 2278200, 1103.12, 1104.42, 1049.98
12/03/2018, 1106.43, 1900355, 1123.14, 1124.65, 1103.6645
11/30/2018, 1094.43, 2554416, 1089.07, 1095.57, 1077.88
11/29/2018, 1088.3, 1403540, 1076.08, 1094.245, 1076
11/28/2018, 1086.23, 2399374, 1048.76, 1086.84, 1035.76
11/27/2018, 1044.41, 1801334, 1041, 1057.58, 1038.49
11/26/2018, 1048.62, 1846430, 1038.35, 1049.31, 1033.91
11/23/2018, 1023.88, 691462, 1030, 1037.59, 1022.3992
11/21/2018, 1037.61, 1531676, 1036.76, 1048.56, 1033.47
11/20/2018, 1025.76, 2447254, 1000, 1031.74, 996.02
11/19/2018, 1020, 1837207, 1057.2, 1060.79, 1016.2601
11/16/2018, 1061.49, 1641232, 1059.41, 1067, 1048.98
11/15/2018, 1064.71, 1819132, 1044.71, 1071.85, 1031.78
11/14/2018, 1043.66, 1561656, 1050, 1054.5643, 1031
11/13/2018, 1036.05, 1496534, 1043.29, 1056.605, 1031.15
11/12/2018, 1038.63, 1429319, 1061.39, 1062.12, 1031
11/09/2018, 1066.15, 1343154, 1073.99, 1075.56, 1053.11
11/08/2018, 1082.4, 1463022, 1091.38, 1093.27, 1072.2048
11/07/2018, 1093.39, 2057155, 1069, 1095.46, 1065.9
11/06/2018, 1055.81, 1225197, 1039.48, 1064.345, 1038.07
11/05/2018, 1040.09, 2436742, 1055, 1058.47, 1021.24
11/02/2018, 1057.79, 1829295, 1073.73, 1082.975, 1054.61
11/01/2018, 1070, 1456222, 1075.8, 1083.975, 1062.46
10/31/2018, 1076.77, 2528584, 1059.81, 1091.94, 1057
10/30/2018, 1036.21, 3209126, 1008.46, 1037.49, 1000.75
10/29/2018, 1020.08, 3873644, 1082.47, 1097.04, 995.83
10/26/2018, 1071.47, 4185201, 1037.03, 1106.53, 1034.09
10/25/2018, 1095.57, 2511884, 1071.79, 1110.98, 1069.55
10/24/2018, 1050.71, 1910060, 1104.25, 1106.12, 1048.74
10/23/2018, 1103.69, 1847798, 1080.89, 1107.89, 1070
10/22/2018, 1101.16, 1494285, 1103.06, 1112.23, 1091
10/19/2018, 1096.46, 1264605, 1093.37, 1110.36, 1087.75
10/18/2018, 1087.97, 2056606, 1121.84, 1121.84, 1077.09
10/17/2018, 1115.69, 1397613, 1126.46, 1128.99, 1102.19
10/16/2018, 1121.28, 1845491, 1104.59, 1124.22, 1102.5
10/15/2018, 1092.25, 1343231, 1108.91, 1113.4464, 1089
10/12/2018, 1110.08, 2029872, 1108, 1115, 1086.402
10/11/2018, 1079.32, 2939514, 1072.94, 1106.4, 1068.27
10/10/2018, 1081.22, 2574985, 1131.08, 1132.17, 1081.13
10/09/2018, 1138.82, 1308706, 1146.15, 1154.35, 1137.572
10/08/2018, 1148.97, 1877142, 1150.11, 1168, 1127.3636
10/05/2018, 1157.35, 1184245, 1167.5, 1173.4999, 1145.12
10/04/2018, 1168.19, 2151762, 1195.33, 1197.51, 1155.576
10/03/2018, 1202.95, 1207280, 1205, 1206.41, 1193.83
10/02/2018, 1200.11, 1655602, 1190.96, 1209.96, 1186.63
10/01/2018, 1195.31, 1345250, 1199.89, 1209.9, 1190.3
09/28/2018, 1193.47, 1306822, 1191.87, 1195.41, 1184.5
09/27/2018, 1194.64, 1244278, 1186.73, 1202.1, 1183.63
09/26/2018, 1180.49, 1346434, 1185.15, 1194.23, 1174.765
09/25/2018, 1184.65, 937577, 1176.15, 1186.88, 1168
09/24/2018, 1173.37, 1218532, 1157.17, 1178, 1146.91
09/21/2018, 1166.09, 4363929, 1192, 1192.21, 1166.04
09/20/2018, 1186.87, 1209855, 1179.99, 1189.89, 1173.36
09/19/2018, 1171.09, 1185321, 1164.98, 1173.21, 1154.58
09/18/2018, 1161.22, 1184407, 1157.09, 1176.08, 1157.09
09/17/2018, 1156.05, 1279147, 1170.14, 1177.24, 1154.03
09/14/2018, 1172.53, 934300, 1179.1, 1180.425, 1168.3295
09/13/2018, 1175.33, 1402005, 1170.74, 1178.61, 1162.85
09/12/2018, 1162.82, 1291304, 1172.72, 1178.61, 1158.36
09/11/2018, 1177.36, 1209171, 1161.63, 1178.68, 1156.24
09/10/2018, 1164.64, 1115259, 1172.19, 1174.54, 1160.11
09/07/2018, 1164.83, 1401034, 1158.67, 1175.26, 1157.215
09/06/2018, 1171.44, 1886690, 1186.3, 1186.3, 1152
09/05/2018, 1186.48, 2043732, 1193.8, 1199.0096, 1162
09/04/2018, 1197, 1800509, 1204.27, 1212.99, 1192.5
08/31/2018, 1218.19, 1812366, 1234.98, 1238.66, 1211.2854
08/30/2018, 1239.12, 1320261, 1244.23, 1253.635, 1232.59
08/29/2018, 1249.3, 1295939, 1237.45, 1250.66, 1236.3588
08/28/2018, 1231.15, 1296532, 1241.29, 1242.545, 1228.69
08/27/2018, 1241.82, 1154962, 1227.6, 1243.09, 1225.716
08/24/2018, 1220.65, 946529, 1208.82, 1221.65, 1206.3588
08/23/2018, 1205.38, 988509, 1207.14, 1221.28, 1204.24
08/22/2018, 1207.33, 881463, 1200, 1211.84, 1199
08/21/2018, 1201.62, 1187884, 1208, 1217.26, 1200.3537
08/20/2018, 1207.77, 864462, 1205.02, 1211, 1194.6264
08/17/2018, 1200.96, 1381724, 1202.03, 1209.02, 1188.24
08/16/2018, 1206.49, 1319985, 1224.73, 1225.9999, 1202.55
08/15/2018, 1214.38, 1815642, 1229.26, 1235.24, 1209.51
08/14/2018, 1242.1, 1342534, 1235.19, 1245.8695, 1225.11
08/13/2018, 1235.01, 957153, 1236.98, 1249.2728, 1233.6405
08/10/2018, 1237.61, 1107323, 1243, 1245.695, 1232
08/09/2018, 1249.1, 805227, 1249.9, 1255.542, 1246.01
08/08/2018, 1245.61, 1369650, 1240.47, 1256.5, 1238.0083
08/07/2018, 1242.22, 1493073, 1237, 1251.17, 1236.17
08/06/2018, 1224.77, 1080923, 1225, 1226.0876, 1215.7965
08/03/2018, 1223.71, 1072524, 1229.62, 1230, 1215.06
08/02/2018, 1226.15, 1520488, 1205.9, 1229.88, 1204.79
08/01/2018, 1220.01, 1567142, 1228, 1233.47, 1210.21
07/31/2018, 1217.26, 1632823, 1220.01, 1227.5877, 1205.6
07/30/2018, 1219.74, 1822782, 1228.01, 1234.916, 1211.47
07/27/2018, 1238.5, 2115802, 1271, 1273.89, 1231
07/26/2018, 1268.33, 2334881, 1251, 1269.7707, 1249.02
07/25/2018, 1263.7, 2115890, 1239.13, 1265.86, 1239.13
07/24/2018, 1248.08, 3303268, 1262.59, 1266, 1235.56
07/23/2018, 1205.5, 2584034, 1181.01, 1206.49, 1181
07/20/2018, 1184.91, 1246898, 1186.96, 1196.86, 1184.22
07/19/2018, 1186.96, 1256113, 1191, 1200, 1183.32
07/18/2018, 1195.88, 1391232, 1196.56, 1204.5, 1190.34
07/17/2018, 1198.8, 1585091, 1172.22, 1203.04, 1170.6
07/16/2018, 1183.86, 1049560, 1189.39, 1191, 1179.28
07/13/2018, 1188.82, 1221687, 1185, 1195.4173, 1180
07/12/2018, 1183.48, 1251083, 1159.89, 1184.41, 1155.935
07/11/2018, 1153.9, 1094301, 1144.59, 1164.29, 1141.0003
07/10/2018, 1152.84, 789249, 1156.98, 1159.59, 1149.59
07/09/2018, 1154.05, 906073, 1148.48, 1154.67, 1143.42
07/06/2018, 1140.17, 966155, 1123.58, 1140.93, 1120.7371
07/05/2018, 1124.27, 1060752, 1110.53, 1127.5, 1108.48
07/03/2018, 1102.89, 679034, 1135.82, 1135.82, 1100.02
07/02/2018, 1127.46, 1188616, 1099, 1128, 1093.8
06/29/2018, 1115.65, 1275979, 1120, 1128.2265, 1115
06/28/2018, 1114.22, 1072438, 1102.09, 1122.31, 1096.01
06/27/2018, 1103.98, 1287698, 1121.34, 1131.8362, 1103.62
06/26/2018, 1118.46, 1559791, 1128, 1133.21, 1116.6589
06/25/2018, 1124.81, 2155276, 1143.6, 1143.91, 1112.78
06/22/2018, 1155.48, 1310164, 1159.14, 1162.4965, 1147.26
06/21/2018, 1157.66, 1232352, 1174.85, 1177.295, 1152.232
06/20/2018, 1169.84, 1648248, 1175.31, 1186.2856, 1169.16
06/19/2018, 1168.06, 1616125, 1158.5, 1171.27, 1154.01
06/18/2018, 1173.46, 1400641, 1143.65, 1174.31, 1143.59
06/15/2018, 1152.26, 2119134, 1148.86, 1153.42, 1143.485
06/14/2018, 1152.12, 1350085, 1143.85, 1155.47, 1140.64
06/13/2018, 1134.79, 1490017, 1141.12, 1146.5, 1133.38
06/12/2018, 1139.32, 899231, 1131.07, 1139.79, 1130.735
06/11/2018, 1129.99, 1071114, 1118.6, 1137.26, 1118.6
06/08/2018, 1120.87, 1289859, 1118.18, 1126.67, 1112.15
06/07/2018, 1123.86, 1519860, 1131.32, 1135.82, 1116.52
06/06/2018, 1136.88, 1697489, 1142.17, 1143, 1125.7429
06/05/2018, 1139.66, 1538169, 1140.99, 1145.738, 1133.19
06/04/2018, 1139.29, 1881046, 1122.33, 1141.89, 1122.005
06/01/2018, 1119.5, 2416755, 1099.35, 1120, 1098.5
05/31/2018, 1084.99, 3085325, 1067.56, 1097.19, 1067.56
05/30/2018, 1067.8, 1129958, 1063.03, 1069.21, 1056.83
05/29/2018, 1060.32, 1858676, 1064.89, 1073.37, 1055.22
05/25/2018, 1075.66, 878903, 1079.02, 1082.56, 1073.775
05/24/2018, 1079.24, 757752, 1079, 1080.47, 1066.15
05/23/2018, 1079.69, 1057712, 1065.13, 1080.78, 1061.71
05/22/2018, 1069.73, 1088700, 1083.56, 1086.59, 1066.69
05/21/2018, 1079.58, 1012258, 1074.06, 1088, 1073.65
05/18/2018, 1066.36, 1496448, 1061.86, 1069.94, 1060.68
05/17/2018, 1078.59, 1031190, 1079.89, 1086.87, 1073.5
05/16/2018, 1081.77, 989819, 1077.31, 1089.27, 1076.26
05/15/2018, 1079.23, 1494306, 1090, 1090.05, 1073.47
05/14/2018, 1100.2, 1450140, 1100, 1110.75, 1099.11
05/11/2018, 1098.26, 1253205, 1093.6, 1101.3295, 1090.91
05/10/2018, 1097.57, 1441456, 1086.03, 1100.44, 1085.64
05/09/2018, 1082.76, 2032319, 1058.1, 1085.44, 1056.365
05/08/2018, 1053.91, 1217260, 1058.54, 1060.55, 1047.145
05/07/2018, 1054.79, 1464008, 1049.23, 1061.68, 1047.1
05/04/2018, 1048.21, 1936797, 1016.9, 1048.51, 1016.9
05/03/2018, 1023.72, 1813623, 1019, 1029.675, 1006.29
05/02/2018, 1024.38, 1534094, 1028.1, 1040.389, 1022.87
05/01/2018, 1037.31, 1427171, 1013.66, 1038.47, 1008.21
04/30/2018, 1017.33, 1664084, 1030.01, 1037, 1016.85
04/27/2018, 1030.05, 1617452, 1046, 1049.5, 1025.59
04/26/2018, 1040.04, 1984448, 1029.51, 1047.98, 1018.19
04/25/2018, 1021.18, 2225495, 1025.52, 1032.49, 1015.31
04/24/2018, 1019.98, 4750851, 1052, 1057, 1010.59
04/23/2018, 1067.45, 2278846, 1077.86, 1082.72, 1060.7
04/20/2018, 1072.96, 1887698, 1082, 1092.35, 1069.57
04/19/2018, 1087.7, 1741907, 1069.4, 1094.165, 1068.18
04/18/2018, 1072.08, 1336678, 1077.43, 1077.43, 1066.225
04/17/2018, 1074.16, 2311903, 1051.37, 1077.88, 1048.26
04/16/2018, 1037.98, 1194144, 1037, 1043.24, 1026.74
04/13/2018, 1029.27, 1175754, 1040.88, 1046.42, 1022.98
04/12/2018, 1032.51, 1357599, 1025.04, 1040.69, 1021.4347
04/11/2018, 1019.97, 1476133, 1027.99, 1031.3641, 1015.87
04/10/2018, 1031.64, 1983510, 1026.44, 1036.28, 1011.34
04/09/2018, 1015.45, 1738682, 1016.8, 1039.6, 1014.08
04/06/2018, 1007.04, 1740896, 1020, 1031.42, 1003.03
04/05/2018, 1027.81, 1345681, 1041.33, 1042.79, 1020.1311
04/04/2018, 1025.14, 2464418, 993.41, 1028.7175, 993
04/03/2018, 1013.41, 2271858, 1013.91, 1020.99, 994.07
04/02/2018, 1006.47, 2679214, 1022.82, 1034.8, 990.37
03/29/2018, 1031.79, 2714402, 1011.63, 1043, 1002.9
03/28/2018, 1004.56, 3345046, 998, 1024.23, 980.64
03/27/2018, 1005.1, 3081612, 1063, 1064.8393, 996.92
03/26/2018, 1053.21, 2593808, 1046, 1055.63, 1008.4
03/23/2018, 1021.57, 2147097, 1047.03, 1063.36, 1021.22
03/22/2018, 1049.08, 2584639, 1081.88, 1082.9, 1045.91
03/21/2018, 1090.88, 1878294, 1092.74, 1106.2999, 1085.15
03/20/2018, 1097.71, 1802209, 1099, 1105.2, 1083.46
03/19/2018, 1099.82, 2355186, 1120.01, 1121.99, 1089.01
03/16/2018, 1135.73, 2614871, 1154.14, 1155.88, 1131.96
03/15/2018, 1149.58, 1397767, 1149.96, 1161.08, 1134.54
03/14/2018, 1149.49, 1290638, 1145.21, 1158.59, 1141.44
03/13/2018, 1138.17, 1874176, 1170, 1176.76, 1133.33
03/12/2018, 1164.5, 2106548, 1163.85, 1177.05, 1157.42
03/09/2018, 1160.04, 2121425, 1136, 1160.8, 1132.4606
03/08/2018, 1126, 1393529, 1115.32, 1127.6, 1112.8
03/07/2018, 1109.64, 1277439, 1089.19, 1112.22, 1085.4823
03/06/2018, 1095.06, 1497087, 1099.22, 1101.85, 1089.775
03/05/2018, 1090.93, 1141932, 1075.14, 1097.1, 1069.0001
03/02/2018, 1078.92, 2271394, 1053.08, 1081.9986, 1048.115
03/01/2018, 1069.52, 2511872, 1107.87, 1110.12, 1067.001
02/28/2018, 1104.73, 1873737, 1123.03, 1127.53, 1103.24
02/27/2018, 1118.29, 1772866, 1141.24, 1144.04, 1118
02/26/2018, 1143.75, 1514920, 1127.8, 1143.96, 1126.695
02/23/2018, 1126.79, 1190432, 1112.64, 1127.28, 1104.7135
02/22/2018, 1106.63, 1309536, 1116.19, 1122.82, 1102.59
02/21/2018, 1111.34, 1507152, 1106.47, 1133.97, 1106.33
02/20/2018, 1102.46, 1389491, 1090.57, 1113.95, 1088.52
02/16/2018, 1094.8, 1680283, 1088.41, 1104.67, 1088.3134
02/15/2018, 1089.52, 1785552, 1079.07, 1091.4794, 1064.34
02/14/2018, 1069.7, 1547665, 1048.95, 1071.72, 1046.75
02/13/2018, 1052.1, 1213800, 1045, 1058.37, 1044.0872
02/12/2018, 1051.94, 2054002, 1048, 1061.5, 1040.928
02/09/2018, 1037.78, 3503970, 1017.25, 1043.97, 992.56
02/08/2018, 1001.52, 2809890, 1055.41, 1058.62, 1000.66
02/07/2018, 1048.58, 2353003, 1081.54, 1081.78, 1048.26
02/06/2018, 1080.6, 3432313, 1027.18, 1081.71, 1023.1367
02/05/2018, 1055.8, 3769453, 1090.6, 1110, 1052.03
02/02/2018, 1111.9, 4837979, 1122, 1123.07, 1107.2779
02/01/2018, 1167.7, 2380221, 1162.61, 1174, 1157.52
01/31/2018, 1169.94, 1523820, 1170.57, 1173, 1159.13
01/30/2018, 1163.69, 1541771, 1167.83, 1176.52, 1163.52
01/29/2018, 1175.58, 1337324, 1176.48, 1186.89, 1171.98
01/26/2018, 1175.84, 1981173, 1175.08, 1175.84, 1158.11
01/25/2018, 1170.37, 1461518, 1172.53, 1175.94, 1162.76
01/24/2018, 1164.24, 1382904, 1177.33, 1179.86, 1161.05
01/23/2018, 1169.97, 1309862, 1159.85, 1171.6266, 1158.75
01/22/2018, 1155.81, 1616120, 1137.49, 1159.88, 1135.1101
01/19/2018, 1137.51, 1390118, 1131.83, 1137.86, 1128.3
01/18/2018, 1129.79, 1194943, 1131.41, 1132.51, 1117.5
01/17/2018, 1131.98, 1200476, 1126.22, 1132.6, 1117.01
01/16/2018, 1121.76, 1566662, 1132.51, 1139.91, 1117.8316
01/12/2018, 1122.26, 1718491, 1102.41, 1124.29, 1101.15
01/11/2018, 1105.52, 977727, 1106.3, 1106.525, 1099.59
01/10/2018, 1102.61, 1042273, 1097.1, 1104.6, 1096.11
01/09/2018, 1106.26, 900089, 1109.4, 1110.57, 1101.2307
01/08/2018, 1106.94, 1046767, 1102.23, 1111.27, 1101.62
01/05/2018, 1102.23, 1279990, 1094, 1104.25, 1092
01/04/2018, 1086.4, 1002945, 1088, 1093.5699, 1084.0017
01/03/2018, 1082.48, 1429757, 1064.31, 1086.29, 1063.21
01/02/2018, 1065, 1236401, 1048.34, 1066.94, 1045.23
12/29/2017, 1046.4, 886845, 1046.72, 1049.7, 1044.9
12/28/2017, 1048.14, 833011, 1051.6, 1054.75, 1044.77
12/27/2017, 1049.37, 1271780, 1057.39, 1058.37, 1048.05
12/26/2017, 1056.74, 761097, 1058.07, 1060.12, 1050.2
12/22/2017, 1060.12, 755089, 1061.11, 1064.2, 1059.44
12/21/2017, 1063.63, 986548, 1064.95, 1069.33, 1061.7938
12/20/2017, 1064.95, 1268285, 1071.78, 1073.38, 1061.52
12/19/2017, 1070.68, 1307894, 1075.2, 1076.84, 1063.55
12/18/2017, 1077.14, 1552016, 1066.08, 1078.49, 1062
12/15/2017, 1064.19, 3275091, 1054.61, 1067.62, 1049.5
12/14/2017, 1049.15, 1558684, 1045, 1058.5, 1043.11
12/13/2017, 1040.61, 1220364, 1046.12, 1046.665, 1038.38
12/12/2017, 1040.48, 1279511, 1039.63, 1050.31, 1033.6897
12/11/2017, 1041.1, 1190527, 1035.5, 1043.8, 1032.0504
12/08/2017, 1037.05, 1288419, 1037.49, 1042.05, 1032.5222
12/07/2017, 1030.93, 1458145, 1020.43, 1034.24, 1018.071
12/06/2017, 1018.38, 1258496, 1001.5, 1024.97, 1001.14
12/05/2017, 1005.15, 2066247, 995.94, 1020.61, 988.28
12/04/2017, 998.68, 1906058, 1012.66, 1016.1, 995.57
12/01/2017, 1010.17, 1908962, 1015.8, 1022.4897, 1002.02
11/30/2017, 1021.41, 1723003, 1022.37, 1028.4899, 1015
11/29/2017, 1021.66, 2442974, 1042.68, 1044.08, 1015.65
11/28/2017, 1047.41, 1421027, 1055.09, 1062.375, 1040
11/27/2017, 1054.21, 1307471, 1040, 1055.46, 1038.44
11/24/2017, 1040.61, 536996, 1035.87, 1043.178, 1035
11/22/2017, 1035.96, 746351, 1035, 1039.706, 1031.43
11/21/2017, 1034.49, 1096161, 1023.31, 1035.11, 1022.655
11/20/2017, 1018.38, 898389, 1020.26, 1022.61, 1017.5
11/17/2017, 1019.09, 1366936, 1034.01, 1034.42, 1017.75
11/16/2017, 1032.5, 1129424, 1022.52, 1035.92, 1022.52
11/15/2017, 1020.91, 847932, 1019.21, 1024.09, 1015.42
11/14/2017, 1026, 958708, 1022.59, 1026.81, 1014.15
11/13/2017, 1025.75, 885565, 1023.42, 1031.58, 1022.57
11/10/2017, 1028.07, 720674, 1026.46, 1030.76, 1025.28
11/09/2017, 1031.26, 1244701, 1033.99, 1033.99, 1019.6656
11/08/2017, 1039.85, 1088395, 1030.52, 1043.522, 1028.45
11/07/2017, 1033.33, 1112123, 1027.27, 1033.97, 1025.13
11/06/2017, 1025.9, 1124757, 1028.99, 1034.87, 1025
11/03/2017, 1032.48, 1075134, 1022.11, 1032.65, 1020.31
11/02/2017, 1025.58, 1048584, 1021.76, 1028.09, 1013.01
11/01/2017, 1025.5, 1371619, 1017.21, 1029.67, 1016.95
10/31/2017, 1016.64, 1331265, 1015.22, 1024, 1010.42
10/30/2017, 1017.11, 2083490, 1014, 1024.97, 1007.5
10/27/2017, 1019.27, 5165922, 1009.19, 1048.39, 1008.2
10/26/2017, 972.56, 2027218, 980, 987.6, 972.2
10/25/2017, 973.33, 1210368, 968.37, 976.09, 960.5201
10/24/2017, 970.54, 1206074, 970, 972.23, 961
10/23/2017, 968.45, 1471544, 989.52, 989.52, 966.12
10/20/2017, 988.2, 1176177, 989.44, 991, 984.58
10/19/2017, 984.45, 1312706, 986, 988.88, 978.39
10/18/2017, 992.81, 1057285, 991.77, 996.72, 986.9747
10/17/2017, 992.18, 1290152, 990.29, 996.44, 988.59
10/16/2017, 992, 910246, 992.1, 993.9065, 984
10/13/2017, 989.68, 1169584, 992, 997.21, 989
10/12/2017, 987.83, 1278357, 987.45, 994.12, 985
10/11/2017, 989.25, 1692843, 973.72, 990.71, 972.25
10/10/2017, 972.6, 968113, 980, 981.57, 966.0801
10/09/2017, 977, 890620, 980, 985.425, 976.11
10/06/2017, 978.89, 1146207, 966.7, 979.46, 963.36
10/05/2017, 969.96, 1210427, 955.49, 970.91, 955.18
10/04/2017, 951.68, 951766, 957, 960.39, 950.69
10/03/2017, 957.79, 888303, 954, 958, 949.14
10/02/2017, 953.27, 1282850, 959.98, 962.54, 947.84
09/29/2017, 959.11, 1576365, 952, 959.7864, 951.51
09/28/2017, 949.5, 997036, 941.36, 950.69, 940.55
09/27/2017, 944.49, 2237538, 927.74, 949.9, 927.74
09/26/2017, 924.86, 1666749, 923.72, 930.82, 921.14
09/25/2017, 920.97, 1855742, 925.45, 926.4, 909.7
09/22/2017, 928.53, 1052170, 927.75, 934.73, 926.48
09/21/2017, 932.45, 1227059, 933, 936.53, 923.83
09/20/2017, 931.58, 1535626, 922.98, 933.88, 922
09/19/2017, 921.81, 912967, 917.42, 922.4199, 912.55
09/18/2017, 915, 1300759, 920.01, 922.08, 910.6
09/15/2017, 920.29, 2499466, 924.66, 926.49, 916.36
09/14/2017, 925.11, 1395497, 931.25, 932.77, 924
09/13/2017, 935.09, 1101145, 930.66, 937.25, 929.86
09/12/2017, 932.07, 1133638, 932.59, 933.48, 923.861
09/11/2017, 929.08, 1266020, 934.25, 938.38, 926.92
09/08/2017, 926.5, 997699, 936.49, 936.99, 924.88
09/07/2017, 935.95, 1211472, 931.73, 936.41, 923.62
09/06/2017, 927.81, 1526209, 930.15, 930.915, 919.27
09/05/2017, 928.45, 1346791, 933.08, 937, 921.96
09/01/2017, 937.34, 943657, 941.13, 942.48, 935.15
08/31/2017, 939.33, 1566888, 931.76, 941.98, 931.76
08/30/2017, 929.57, 1300616, 920.05, 930.819, 919.65
08/29/2017, 921.29, 1181391, 905.1, 923.33, 905
08/28/2017, 913.81, 1085014, 916, 919.245, 911.87
08/25/2017, 915.89, 1052764, 923.49, 925.555, 915.5
08/24/2017, 921.28, 1266191, 928.66, 930.84, 915.5
08/23/2017, 927, 1088575, 921.93, 929.93, 919.36
08/22/2017, 924.69, 1166320, 912.72, 925.86, 911.4751
08/21/2017, 906.66, 942328, 910, 913, 903.4
08/18/2017, 910.67, 1341990, 910.31, 915.275, 907.1543
08/17/2017, 910.98, 1241782, 925.78, 926.86, 910.98
08/16/2017, 926.96, 1005261, 925.29, 932.7, 923.445
08/15/2017, 922.22, 882479, 924.23, 926.5499, 919.82
08/14/2017, 922.67, 1063404, 922.53, 924.668, 918.19
08/11/2017, 914.39, 1205652, 907.97, 917.78, 905.58
08/10/2017, 907.24, 1755521, 917.55, 919.26, 906.13
08/09/2017, 922.9, 1191332, 920.61, 925.98, 917.2501
08/08/2017, 926.79, 1057351, 927.09, 935.814, 925.6095
08/07/2017, 929.36, 1031710, 929.06, 931.7, 926.5
08/04/2017, 927.96, 1081814, 926.75, 930.3068, 923.03
08/03/2017, 923.65, 1201519, 930.34, 932.24, 922.24
08/02/2017, 930.39, 1822272, 928.61, 932.6, 916.68
08/01/2017, 930.83, 1234612, 932.38, 937.447, 929.26
07/31/2017, 930.5, 1964748, 941.89, 943.59, 926.04
07/28/2017, 941.53, 1802343, 929.4, 943.83, 927.5
07/27/2017, 934.09, 3128819, 951.78, 951.78, 920
07/26/2017, 947.8, 2069349, 954.68, 955, 942.2788
07/25/2017, 950.7, 4656609, 953.81, 959.7, 945.4
07/24/2017, 980.34, 3205374, 972.22, 986.2, 970.77
07/21/2017, 972.92, 1697190, 962.25, 973.23, 960.15
07/20/2017, 968.15, 1620636, 975, 975.9, 961.51
07/19/2017, 970.89, 1221155, 967.84, 973.04, 964.03
07/18/2017, 965.4, 1152741, 953, 968.04, 950.6
07/17/2017, 953.42, 1164141, 957, 960.74, 949.2407
07/14/2017, 955.99, 1052855, 952, 956.91, 948.005
07/13/2017, 947.16, 1294674, 946.29, 954.45, 943.01
07/12/2017, 943.83, 1517168, 938.68, 946.3, 934.47
07/11/2017, 930.09, 1112417, 929.54, 931.43, 922
07/10/2017, 928.8, 1190237, 921.77, 930.38, 919.59
07/07/2017, 918.59, 1590456, 908.85, 921.54, 908.85
07/06/2017, 906.69, 1424290, 904.12, 914.9444, 899.7
07/05/2017, 911.71, 1813309, 901.76, 914.51, 898.5
07/03/2017, 898.7, 1710373, 912.18, 913.94, 894.79
06/30/2017, 908.73, 2086340, 926.05, 926.05, 908.31
06/29/2017, 917.79, 3287991, 929.92, 931.26, 910.62
06/28/2017, 940.49, 2719213, 929, 942.75, 916
06/27/2017, 927.33, 2566047, 942.46, 948.29, 926.85
06/26/2017, 952.27, 1596664, 969.9, 973.31, 950.79
06/23/2017, 965.59, 1527513, 956.83, 966, 954.2
06/22/2017, 957.09, 941639, 958.7, 960.72, 954.55
06/21/2017, 959.45, 1201971, 953.64, 960.1, 950.76
06/20/2017, 950.63, 1125520, 957.52, 961.62, 950.01
06/19/2017, 957.37, 1520715, 949.96, 959.99, 949.05
06/16/2017, 939.78, 3061794, 940, 942.04, 931.595
06/15/2017, 942.31, 2065271, 933.97, 943.339, 924.44
06/14/2017, 950.76, 1487378, 959.92, 961.15, 942.25
06/13/2017, 953.4, 2012980, 951.91, 959.98, 944.09
06/12/2017, 942.9, 3762434, 939.56, 949.355, 915.2328
06/09/2017, 949.83, 3305545, 984.5, 984.5, 935.63
06/08/2017, 983.41, 1477151, 982.35, 984.57, 977.2
06/07/2017, 981.08, 1447172, 979.65, 984.15, 975.77
06/06/2017, 976.57, 1814323, 983.16, 988.25, 975.14
06/05/2017, 983.68, 1251903, 976.55, 986.91, 975.1
06/02/2017, 975.6, 1750723, 969.46, 975.88, 966
06/01/2017, 966.95, 1408958, 968.95, 971.5, 960.01
05/31/2017, 964.86, 2447176, 975.02, 979.27, 960.18
05/30/2017, 975.88, 1466288, 970.31, 976.2, 969.49
05/26/2017, 971.47, 1251425, 969.7, 974.98, 965.03
05/25/2017, 969.54, 1659422, 957.33, 972.629, 955.47
05/24/2017, 954.96, 1031408, 952.98, 955.09, 949.5
05/23/2017, 948.82, 1269438, 947.92, 951.4666, 942.575
05/22/2017, 941.86, 1118456, 935, 941.8828, 935
05/19/2017, 934.01, 1389848, 931.47, 937.755, 931
05/18/2017, 930.24, 1596058, 921, 933.17, 918.75
05/17/2017, 919.62, 2357922, 935.67, 939.3325, 918.14
05/16/2017, 943, 968288, 940, 943.11, 937.58
05/15/2017, 937.08, 1104595, 932.95, 938.25, 929.34
05/12/2017, 932.22, 1050377, 931.53, 933.44, 927.85
05/11/2017, 930.6, 834997, 925.32, 932.53, 923.0301
05/10/2017, 928.78, 1173887, 931.98, 932, 925.16
05/09/2017, 932.17, 1581236, 936.95, 937.5, 929.53
05/08/2017, 934.3, 1328885, 926.12, 936.925, 925.26
05/05/2017, 927.13, 1910317, 933.54, 934.9, 925.2
05/04/2017, 931.66, 1421938, 926.07, 935.93, 924.59
05/03/2017, 927.04, 1497565, 914.86, 928.1, 912.5426
05/02/2017, 916.44, 1543696, 909.62, 920.77, 909.4526
05/01/2017, 912.57, 2114629, 901.94, 915.68, 901.45
04/28/2017, 905.96, 3223850, 910.66, 916.85, 905.77
04/27/2017, 874.25, 2009509, 873.6, 875.4, 870.38
04/26/2017, 871.73, 1233724, 874.23, 876.05, 867.7481
04/25/2017, 872.3, 1670095, 865, 875, 862.81
04/24/2017, 862.76, 1371722, 851.2, 863.45, 849.86
04/21/2017, 843.19, 1323364, 842.88, 843.88, 840.6
04/20/2017, 841.65, 957994, 841.44, 845.2, 839.32
04/19/2017, 838.21, 954324, 839.79, 842.22, 836.29
04/18/2017, 836.82, 835433, 834.22, 838.93, 832.71
04/17/2017, 837.17, 894540, 825.01, 837.75, 824.47
04/13/2017, 823.56, 1118221, 822.14, 826.38, 821.44
04/12/2017, 824.32, 900059, 821.93, 826.66, 821.02
04/11/2017, 823.35, 1078951, 824.71, 827.4267, 817.0201
04/10/2017, 824.73, 978825, 825.39, 829.35, 823.77
04/07/2017, 824.67, 1056692, 827.96, 828.485, 820.5127
04/06/2017, 827.88, 1254235, 832.4, 836.39, 826.46
04/05/2017, 831.41, 1553163, 835.51, 842.45, 830.72
04/04/2017, 834.57, 1044455, 831.36, 835.18, 829.0363
04/03/2017, 838.55, 1670349, 829.22, 840.85, 829.22
03/31/2017, 829.56, 1401756, 828.97, 831.64, 827.39
03/30/2017, 831.5, 1055263, 833.5, 833.68, 829
03/29/2017, 831.41, 1785006, 825, 832.765, 822.3801
03/28/2017, 820.92, 1620532, 820.41, 825.99, 814.027
03/27/2017, 819.51, 1894735, 806.95, 821.63, 803.37
03/24/2017, 814.43, 1980415, 820.08, 821.93, 808.89
03/23/2017, 817.58, 3485390, 821, 822.57, 812.257
03/22/2017, 829.59, 1399409, 831.91, 835.55, 827.1801
03/21/2017, 830.46, 2461375, 851.4, 853.5, 829.02
03/20/2017, 848.4, 1217560, 850.01, 850.22, 845.15
03/17/2017, 852.12, 1712397, 851.61, 853.4, 847.11
03/16/2017, 848.78, 977384, 849.03, 850.85, 846.13
03/15/2017, 847.2, 1381328, 847.59, 848.63, 840.77
03/14/2017, 845.62, 779920, 843.64, 847.24, 840.8
03/13/2017, 845.54, 1149928, 844, 848.685, 843.25
03/10/2017, 843.25, 1702731, 843.28, 844.91, 839.5
03/09/2017, 838.68, 1261393, 836, 842, 834.21
03/08/2017, 835.37, 988900, 833.51, 838.15, 831.79
03/07/2017, 831.91, 1037573, 827.4, 833.41, 826.52
03/06/2017, 827.78, 1108799, 826.95, 828.88, 822.4
03/03/2017, 829.08, 890640, 830.56, 831.36, 825.751
03/02/2017, 830.63, 937824, 833.85, 834.51, 829.64
03/01/2017, 835.24, 1495934, 828.85, 836.255, 827.26
02/28/2017, 823.21, 2258695, 825.61, 828.54, 820.2
02/27/2017, 829.28, 1101120, 824.55, 830.5, 824
02/24/2017, 828.64, 1392039, 827.73, 829, 824.2
02/23/2017, 831.33, 1471342, 830.12, 832.46, 822.88
02/22/2017, 830.76, 983058, 828.66, 833.25, 828.64
02/21/2017, 831.66, 1259841, 828.66, 833.45, 828.35
02/17/2017, 828.07, 1602549, 823.02, 828.07, 821.655
02/16/2017, 824.16, 1285919, 819.93, 824.4, 818.98
02/15/2017, 818.98, 1311316, 819.36, 823, 818.47
02/14/2017, 820.45, 1054472, 819, 823, 816
02/13/2017, 819.24, 1205835, 816, 820.959, 815.49
02/10/2017, 813.67, 1134701, 811.7, 815.25, 809.78
02/09/2017, 809.56, 990260, 809.51, 810.66, 804.54
02/08/2017, 808.38, 1155892, 807, 811.84, 803.1903
02/07/2017, 806.97, 1240257, 803.99, 810.5, 801.78
02/06/2017, 801.34, 1182882, 799.7, 801.67, 795.2501
02/03/2017, 801.49, 1461217, 802.99, 806, 800.37
02/02/2017, 798.53, 1530827, 793.8, 802.7, 792
02/01/2017, 795.695, 2027708, 799.68, 801.19, 791.19
01/31/2017, 796.79, 2153957, 796.86, 801.25, 790.52
01/30/2017, 802.32, 3243568, 814.66, 815.84, 799.8
01/27/2017, 823.31, 2964989, 834.71, 841.95, 820.44
01/26/2017, 832.15, 2944642, 837.81, 838, 827.01
01/25/2017, 835.67, 1612854, 829.62, 835.77, 825.06
01/24/2017, 823.87, 1472228, 822.3, 825.9, 817.821
01/23/2017, 819.31, 1962506, 807.25, 820.87, 803.74
01/20/2017, 805.02, 1668638, 806.91, 806.91, 801.69
01/19/2017, 802.175, 917085, 805.12, 809.48, 801.8
01/18/2017, 806.07, 1293893, 805.81, 806.205, 800.99
01/17/2017, 804.61, 1361935, 807.08, 807.14, 800.37
01/13/2017, 807.88, 1098154, 807.48, 811.2244, 806.69
01/12/2017, 806.36, 1352872, 807.14, 807.39, 799.17
01/11/2017, 807.91, 1065360, 805, 808.15, 801.37
01/10/2017, 804.79, 1176637, 807.86, 809.1299, 803.51
01/09/2017, 806.65, 1274318, 806.4, 809.9664, 802.83
01/06/2017, 806.15, 1639246, 795.26, 807.9, 792.2041
01/05/2017, 794.02, 1334028, 786.08, 794.48, 785.02
01/04/2017, 786.9, 1071198, 788.36, 791.34, 783.16
01/03/2017, 786.14, 1657291, 778.81, 789.63, 775.8
12/30/2016, 771.82, 1769809, 782.75, 782.78, 770.41
12/29/2016, 782.79, 743808, 783.33, 785.93, 778.92
12/28/2016, 785.05, 1142148, 793.7, 794.23, 783.2
12/27/2016, 791.55, 789151, 790.68, 797.86, 787.657
12/23/2016, 789.91, 623682, 790.9, 792.74, 787.28
12/22/2016, 791.26, 972147, 792.36, 793.32, 788.58
12/21/2016, 794.56, 1208770, 795.84, 796.6757, 787.1
12/20/2016, 796.42, 950345, 796.76, 798.65, 793.27
12/19/2016, 794.2, 1231966, 790.22, 797.66, 786.27
12/16/2016, 790.8, 2435100, 800.4, 800.8558, 790.29
12/15/2016, 797.85, 1623709, 797.34, 803, 792.92
12/14/2016, 797.07, 1700875, 797.4, 804, 794.01
12/13/2016, 796.1, 2122735, 793.9, 804.3799, 793.34
12/12/2016, 789.27, 2102288, 785.04, 791.25, 784.3554
12/09/2016, 789.29, 1821146, 780, 789.43, 779.021
12/08/2016, 776.42, 1487517, 772.48, 778.18, 767.23
12/07/2016, 771.19, 1757710, 761, 771.36, 755.8
12/06/2016, 759.11, 1690365, 764.73, 768.83, 757.34
12/05/2016, 762.52, 1393566, 757.71, 763.9, 752.9
12/02/2016, 750.5, 1452181, 744.59, 754, 743.1
12/01/2016, 747.92, 3017001, 757.44, 759.85, 737.0245
11/30/2016, 758.04, 2386628, 770.07, 772.99, 754.83
11/29/2016, 770.84, 1616427, 771.53, 778.5, 768.24
11/28/2016, 768.24, 2177039, 760, 779.53, 759.8
11/25/2016, 761.68, 587421, 764.26, 765, 760.52
11/23/2016, 760.99, 1477501, 767.73, 768.2825, 755.25
11/22/2016, 768.27, 1592372, 772.63, 776.96, 767
11/21/2016, 769.2, 1324431, 762.61, 769.7, 760.6
11/18/2016, 760.54, 1528555, 771.37, 775, 760
11/17/2016, 771.23, 1298484, 766.92, 772.7, 764.23
11/16/2016, 764.48, 1468196, 755.2, 766.36, 750.51
11/15/2016, 758.49, 2375056, 746.97, 764.4162, 746.97
11/14/2016, 736.08, 3644965, 755.6, 757.85, 727.54
11/11/2016, 754.02, 2421889, 756.54, 760.78, 750.38
11/10/2016, 762.56, 4733916, 791.17, 791.17, 752.18
11/09/2016, 785.31, 2603860, 779.94, 791.2265, 771.67
11/08/2016, 790.51, 1361472, 783.4, 795.633, 780.19
11/07/2016, 782.52, 1574426, 774.5, 785.19, 772.55
11/04/2016, 762.02, 2131948, 750.66, 770.36, 750.5611
11/03/2016, 762.13, 1933937, 767.25, 769.95, 759.03
11/02/2016, 768.7, 1905814, 778.2, 781.65, 763.4496
11/01/2016, 783.61, 2404898, 782.89, 789.49, 775.54
10/31/2016, 784.54, 2420892, 795.47, 796.86, 784
10/28/2016, 795.37, 4261912, 808.35, 815.49, 793.59
10/27/2016, 795.35, 2723097, 801, 803.49, 791.5
10/26/2016, 799.07, 1645403, 806.34, 806.98, 796.32
10/25/2016, 807.67, 1575020, 816.68, 816.68, 805.14
10/24/2016, 813.11, 1693162, 804.9, 815.18, 804.82
10/21/2016, 799.37, 1262042, 795, 799.5, 794
10/20/2016, 796.97, 1755546, 803.3, 803.97, 796.03
10/19/2016, 801.56, 1762990, 798.86, 804.63, 797.635
10/18/2016, 795.26, 2046338, 787.85, 801.61, 785.565
10/17/2016, 779.96, 1091524, 779.8, 785.85, 777.5
10/14/2016, 778.53, 851512, 781.65, 783.95, 776
10/13/2016, 778.19, 1360619, 781.22, 781.22, 773
10/12/2016, 786.14, 935138, 783.76, 788.13, 782.06
10/11/2016, 783.07, 1371461, 786.66, 792.28, 780.58
10/10/2016, 785.94, 1161410, 777.71, 789.38, 775.87
10/07/2016, 775.08, 932444, 779.66, 779.66, 770.75
10/06/2016, 776.86, 1066910, 779, 780.48, 775.54
10/05/2016, 776.47, 1457661, 779.31, 782.07, 775.65
10/04/2016, 776.43, 1198361, 776.03, 778.71, 772.89
10/03/2016, 772.56, 1276614, 774.25, 776.065, 769.5
09/30/2016, 777.29, 1583293, 776.33, 780.94, 774.09
09/29/2016, 775.01, 1310252, 781.44, 785.8, 774.232
09/28/2016, 781.56, 1108249, 777.85, 781.81, 774.97
09/27/2016, 783.01, 1152760, 775.5, 785.9899, 774.308
09/26/2016, 774.21, 1531788, 782.74, 782.74, 773.07
09/23/2016, 786.9, 1411439, 786.59, 788.93, 784.15
09/22/2016, 787.21, 1483899, 780, 789.85, 778.44
09/21/2016, 776.22, 1166290, 772.66, 777.16, 768.301
09/20/2016, 771.41, 975434, 769, 773.33, 768.53
09/19/2016, 765.7, 1171969, 772.42, 774, 764.4406
09/16/2016, 768.88, 2047036, 769.75, 769.75, 764.66
09/15/2016, 771.76, 1344945, 762.89, 773.8, 759.96
09/14/2016, 762.49, 1093723, 759.61, 767.68, 759.11
09/13/2016, 759.69, 1394158, 764.48, 766.2195, 755.8
09/12/2016, 769.02, 1310493, 755.13, 770.29, 754.0001
09/09/2016, 759.66, 1879903, 770.1, 773.245, 759.66
09/08/2016, 775.32, 1268663, 778.59, 780.35, 773.58
09/07/2016, 780.35, 893874, 780, 782.73, 776.2
09/06/2016, 780.08, 1441864, 773.45, 782, 771
09/02/2016, 771.46, 1070725, 773.01, 773.9199, 768.41
09/01/2016, 768.78, 925019, 769.25, 771.02, 764.3
08/31/2016, 767.05, 1247937, 767.01, 769.09, 765.38
08/30/2016, 769.09, 1129932, 769.33, 774.466, 766.84
08/29/2016, 772.15, 847537, 768.74, 774.99, 766.615
08/26/2016, 769.54, 1164713, 769, 776.0799, 765.85
08/25/2016, 769.41, 926856, 767, 771.89, 763.1846
08/24/2016, 769.64, 1071569, 770.58, 774.5, 767.07
08/23/2016, 772.08, 925356, 775.48, 776.44, 771.785
08/22/2016, 772.15, 950417, 773.27, 774.54, 770.0502
08/19/2016, 775.42, 860899, 775, 777.1, 773.13
08/18/2016, 777.5, 718882, 780.01, 782.86, 777
08/17/2016, 779.91, 921666, 777.32, 780.81, 773.53
08/16/2016, 777.14, 1027836, 780.3, 780.98, 773.444
08/15/2016, 782.44, 938183, 783.75, 787.49, 780.11
08/12/2016, 783.22, 739761, 781.5, 783.395, 780.4
08/11/2016, 784.85, 971742, 785, 789.75, 782.97
08/10/2016, 784.68, 784559, 783.75, 786.8123, 782.778
08/09/2016, 784.26, 1318457, 781.1, 788.94, 780.57
08/08/2016, 781.76, 1106693, 782, 782.63, 778.091
08/05/2016, 782.22, 1799478, 773.78, 783.04, 772.34
08/04/2016, 771.61, 1139972, 772.22, 774.07, 768.795
08/03/2016, 773.18, 1283186, 767.18, 773.21, 766.82
08/02/2016, 771.07, 1782822, 768.69, 775.84, 767.85
08/01/2016, 772.88, 2697699, 761.09, 780.43, 761.09
07/29/2016, 768.79, 3830103, 772.71, 778.55, 766.77
07/28/2016, 745.91, 3473040, 747.04, 748.65, 739.3
07/27/2016, 741.77, 1509133, 738.28, 744.46, 737
07/26/2016, 738.42, 1182993, 739.04, 741.69, 734.27
07/25/2016, 739.77, 1031643, 740.67, 742.61, 737.5
07/22/2016, 742.74, 1256741, 741.86, 743.24, 736.56
07/21/2016, 738.63, 1022229, 740.36, 741.69, 735.831
07/20/2016, 741.19, 1283931, 737.33, 742.13, 737.1
07/19/2016, 736.96, 1225467, 729.89, 736.99, 729
07/18/2016, 733.78, 1284740, 722.71, 736.13, 721.19
07/15/2016, 719.85, 1277514, 725.73, 725.74, 719.055
07/14/2016, 720.95, 949456, 721.58, 722.21, 718.03
07/13/2016, 716.98, 933352, 723.62, 724, 716.85
07/12/2016, 720.64, 1336112, 719.12, 722.94, 715.91
07/11/2016, 715.09, 1107039, 708.05, 716.51, 707.24
07/08/2016, 705.63, 1573909, 699.5, 705.71, 696.435
07/07/2016, 695.36, 1303661, 698.08, 698.2, 688.215
07/06/2016, 697.77, 1411080, 689.98, 701.68, 689.09
07/05/2016, 694.49, 1462879, 696.06, 696.94, 688.88
07/01/2016, 699.21, 1344387, 692.2, 700.65, 692.1301
06/30/2016, 692.1, 1597298, 685.47, 692.32, 683.65
06/29/2016, 684.11, 1931436, 683, 687.4292, 681.41
06/28/2016, 680.04, 2169704, 678.97, 680.33, 673
06/27/2016, 668.26, 2632011, 671, 672.3, 663.284
06/24/2016, 675.22, 4442943, 675.17, 689.4, 673.45
06/23/2016, 701.87, 2166183, 697.45, 701.95, 687
06/22/2016, 697.46, 1182161, 699.06, 700.86, 693.0819
06/21/2016, 695.94, 1464836, 698.4, 702.77, 692.01
06/20/2016, 693.71, 2080645, 698.77, 702.48, 693.41
06/17/2016, 691.72, 3397720, 708.65, 708.82, 688.4515
06/16/2016, 710.36, 1981657, 714.91, 716.65, 703.26
06/15/2016, 718.92, 1213386, 719, 722.98, 717.31
06/14/2016, 718.27, 1303808, 716.48, 722.47, 713.12
06/13/2016, 718.36, 1255199, 716.51, 725.44, 716.51
06/10/2016, 719.41, 1213989, 719.47, 725.89, 716.43
06/09/2016, 728.58, 987635, 722.87, 729.54, 722.3361
06/08/2016, 728.28, 1583325, 723.96, 728.57, 720.58
06/07/2016, 716.65, 1336348, 719.84, 721.98, 716.55
06/06/2016, 716.55, 1565955, 724.91, 724.91, 714.61
06/03/2016, 722.34, 1225924, 729.27, 729.49, 720.56
06/02/2016, 730.4, 1340664, 732.5, 733.02, 724.17
06/01/2016, 734.15, 1251468, 734.53, 737.21, 730.66
05/31/2016, 735.72, 2128358, 731.74, 739.73, 731.26
05/27/2016, 732.66, 1974425, 724.01, 733.936, 724
05/26/2016, 724.12, 1573635, 722.87, 728.33, 720.28
05/25/2016, 725.27, 1629790, 720.76, 727.51, 719.7047
05/24/2016, 720.09, 1926828, 706.86, 720.97, 706.86
05/23/2016, 704.24, 1326386, 706.53, 711.4781, 704.18
05/20/2016, 709.74, 1825830, 701.62, 714.58, 700.52
05/19/2016, 700.32, 1668887, 702.36, 706, 696.8
05/18/2016, 706.63, 1765632, 703.67, 711.6, 700.63
05/17/2016, 706.23, 1999883, 715.99, 721.52, 704.11
05/16/2016, 716.49, 1316719, 709.13, 718.48, 705.65
05/13/2016, 710.83, 1307559, 711.93, 716.6619, 709.26
05/12/2016, 713.31, 1361170, 717.06, 719.25, 709
05/11/2016, 715.29, 1690862, 723.41, 724.48, 712.8
05/10/2016, 723.18, 1568621, 716.75, 723.5, 715.72
05/09/2016, 712.9, 1509892, 712, 718.71, 710
05/06/2016, 711.12, 1828508, 698.38, 711.86, 698.1067
05/05/2016, 701.43, 1680220, 697.7, 702.3199, 695.72
05/04/2016, 695.7, 1692757, 690.49, 699.75, 689.01
05/03/2016, 692.36, 1541297, 696.87, 697.84, 692
05/02/2016, 698.21, 1645013, 697.63, 700.64, 691
04/29/2016, 693.01, 2486584, 690.7, 697.62, 689
04/28/2016, 691.02, 2859790, 708.26, 714.17, 689.55
04/27/2016, 705.84, 3094905, 707.29, 708.98, 692.3651
04/26/2016, 708.14, 2739133, 725.42, 725.766, 703.0264
04/25/2016, 723.15, 1956956, 716.1, 723.93, 715.59
04/22/2016, 718.77, 5949699, 726.3, 736.12, 713.61
04/21/2016, 759.14, 2995094, 755.38, 760.45, 749.55
04/20/2016, 752.67, 1526776, 758, 758.1315, 750.01
04/19/2016, 753.93, 2027962, 769.51, 769.9, 749.33
04/18/2016, 766.61, 1557199, 760.46, 768.05, 757.3
04/15/2016, 759, 1807062, 753.98, 761, 752.6938
04/14/2016, 753.2, 1134056, 754.01, 757.31, 752.705
04/13/2016, 751.72, 1707397, 749.16, 754.38, 744.261
04/12/2016, 743.09, 1349780, 738, 743.83, 731.01
04/11/2016, 736.1, 1218789, 743.02, 745, 736.05
04/08/2016, 739.15, 1289869, 743.97, 745.45, 735.55
04/07/2016, 740.28, 1452369, 745.37, 746.9999, 736.28
04/06/2016, 745.69, 1052171, 735.77, 746.24, 735.56
04/05/2016, 737.8, 1130817, 738, 742.8, 735.37
04/04/2016, 745.29, 1134214, 750.06, 752.8, 742.43
04/01/2016, 749.91, 1576240, 738.6, 750.34, 737
03/31/2016, 744.95, 1718638, 749.25, 750.85, 740.94
03/30/2016, 750.53, 1782278, 750.1, 757.88, 748.74
03/29/2016, 744.77, 1902254, 734.59, 747.25, 728.76
03/28/2016, 733.53, 1300817, 736.79, 738.99, 732.5
03/24/2016, 735.3, 1570474, 732.01, 737.747, 731
03/23/2016, 738.06, 1431130, 742.36, 745.7199, 736.15
03/22/2016, 740.75, 1269263, 737.46, 745, 737.46
03/21/2016, 742.09, 1835963, 736.5, 742.5, 733.5157
03/18/2016, 737.6, 2982194, 741.86, 742, 731.83
03/17/2016, 737.78, 1859562, 736.45, 743.07, 736
03/16/2016, 736.09, 1621412, 726.37, 737.47, 724.51
03/15/2016, 728.33, 1720790, 726.92, 732.29, 724.77
03/14/2016, 730.49, 1717002, 726.81, 735.5, 725.15
03/11/2016, 726.82, 1968164, 720, 726.92, 717.125
03/10/2016, 712.82, 2830630, 708.12, 716.44, 703.36
03/09/2016, 705.24, 1419661, 698.47, 705.68, 694
03/08/2016, 693.97, 2075305, 688.59, 703.79, 685.34
03/07/2016, 695.16, 2986064, 706.9, 708.0912, 686.9
03/04/2016, 710.89, 1971379, 714.99, 716.49, 706.02
03/03/2016, 712.42, 1956958, 718.68, 719.45, 706.02
03/02/2016, 718.85, 1629501, 719, 720, 712
03/01/2016, 718.81, 2148608, 703.62, 718.81, 699.77
02/29/2016, 697.77, 2478214, 700.32, 710.89, 697.68
02/26/2016, 705.07, 2241785, 708.58, 713.43, 700.86
02/25/2016, 705.75, 1640430, 700.01, 705.98, 690.585
02/24/2016, 699.56, 1961258, 688.92, 700, 680.78
02/23/2016, 695.85, 2006572, 701.45, 708.4, 693.58
02/22/2016, 706.46, 1949046, 707.45, 713.24, 702.51
02/19/2016, 700.91, 1585152, 695.03, 703.0805, 694.05
02/18/2016, 697.35, 1880306, 710, 712.35, 696.03
02/17/2016, 708.4, 2490021, 699, 709.75, 691.38
02/16/2016, 691, 2517324, 692.98, 698, 685.05
02/12/2016, 682.4, 2138937, 690.26, 693.75, 678.6
02/11/2016, 683.11, 3021587, 675, 689.35, 668.8675
02/10/2016, 684.12, 2629130, 686.86, 701.31, 682.13
02/09/2016, 678.11, 3605792, 672.32, 699.9, 668.77
02/08/2016, 682.74, 4241416, 667.85, 684.03, 663.06
02/05/2016, 683.57, 5098357, 703.87, 703.99, 680.15
02/04/2016, 708.01, 5157988, 722.81, 727, 701.86
02/03/2016, 726.95, 6166731, 770.22, 774.5, 720.5
02/02/2016, 764.65, 6340548, 784.5, 789.8699, 764.65
02/01/2016, 752, 5065235, 750.46, 757.86, 743.27
01/29/2016, 742.95, 3464432, 731.53, 744.9899, 726.8
01/28/2016, 730.96, 2664956, 722.22, 733.69, 712.35
01/27/2016, 699.99, 2175913, 713.67, 718.235, 694.39
01/26/2016, 713.04, 1329141, 713.85, 718.28, 706.48
01/25/2016, 711.67, 1709777, 723.58, 729.68, 710.01
01/22/2016, 725.25, 2009951, 723.6, 728.13, 720.121
01/21/2016, 706.59, 2411079, 702.18, 719.19, 694.46
01/20/2016, 698.45, 3441642, 688.61, 706.85, 673.26
01/19/2016, 701.79, 2264747, 703.3, 709.98, 693.4101
01/15/2016, 694.45, 3604137, 692.29, 706.74, 685.37
01/14/2016, 714.72, 2225495, 705.38, 721.925, 689.1
01/13/2016, 700.56, 2497086, 730.85, 734.74, 698.61
01/12/2016, 726.07, 2010026, 721.68, 728.75, 717.3165
01/11/2016, 716.03, 2089495, 716.61, 718.855, 703.54
01/08/2016, 714.47, 2449420, 731.45, 733.23, 713
01/07/2016, 726.39, 2960578, 730.31, 738.5, 719.06
01/06/2016, 743.62, 1943685, 730, 747.18, 728.92
01/05/2016, 742.58, 1949386, 746.45, 752, 738.64
01/04/2016, 741.84, 3271348, 743, 744.06, 731.2577
12/31/2015, 758.88, 1500129, 769.5, 769.5, 758.34
12/30/2015, 771, 1293514, 776.6, 777.6, 766.9
12/29/2015, 776.6, 1764044, 766.69, 779.98, 766.43
12/28/2015, 762.51, 1515574, 752.92, 762.99, 749.52
12/24/2015, 748.4, 527223, 749.55, 751.35, 746.62
12/23/2015, 750.31, 1566723, 753.47, 754.21, 744
12/22/2015, 750, 1365420, 751.65, 754.85, 745.53
12/21/2015, 747.77, 1524535, 746.13, 750, 740
12/18/2015, 739.31, 3140906, 746.51, 754.13, 738.15
12/17/2015, 749.43, 1551087, 762.42, 762.68, 749
12/16/2015, 758.09, 1986319, 750, 760.59, 739.435
12/15/2015, 743.4, 2661199, 753, 758.08, 743.01
12/14/2015, 747.77, 2417778, 741.79, 748.73, 724.17
12/11/2015, 738.87, 2223284, 741.16, 745.71, 736.75
12/10/2015, 749.46, 1988035, 752.85, 755.85, 743.83
12/09/2015, 751.61, 2697978, 759.17, 764.23, 737.001
12/08/2015, 762.37, 1829004, 757.89, 764.8, 754.2
12/07/2015, 763.25, 1811336, 767.77, 768.73, 755.09
12/04/2015, 766.81, 2756194, 753.1, 768.49, 750
12/03/2015, 752.54, 2589641, 766.01, 768.995, 745.63
12/02/2015, 762.38, 2196721, 768.9, 775.955, 758.96
12/01/2015, 767.04, 2131827, 747.11, 768.95, 746.7
11/30/2015, 742.6, 2045584, 748.81, 754.93, 741.27
11/27/2015, 750.26, 838528, 748.46, 753.41, 747.49
11/25/2015, 748.15, 1122224, 748.14, 752, 746.06
11/24/2015, 748.28, 2333700, 752, 755.279, 737.63
11/23/2015, 755.98, 1414640, 757.45, 762.7075, 751.82
11/20/2015, 756.6, 2212934, 746.53, 757.92, 743
11/19/2015, 738.41, 1327265, 738.74, 742, 737.43
11/18/2015, 740, 1683978, 727.58, 741.41, 727
11/17/2015, 725.3, 1507449, 729.29, 731.845, 723.027
11/16/2015, 728.96, 1904395, 715.6, 729.49, 711.33
11/13/2015, 717, 2072392, 729.17, 731.15, 716.73
11/12/2015, 731.23, 1836567, 731, 737.8, 728.645
11/11/2015, 735.4, 1366611, 732.46, 741, 730.23
11/10/2015, 728.32, 1606499, 724.4, 730.59, 718.5001
11/09/2015, 724.89, 2068920, 730.2, 734.71, 719.43
11/06/2015, 733.76, 1510586, 731.5, 735.41, 727.01
11/05/2015, 731.25, 1861100, 729.47, 739.48, 729.47
11/04/2015, 728.11, 1705745, 722, 733.1, 721.9
11/03/2015, 722.16, 1565355, 718.86, 724.65, 714.72
11/02/2015, 721.11, 1885155, 711.06, 721.62, 705.85
10/30/2015, 710.81, 1907732, 715.73, 718, 710.05
10/29/2015, 716.92, 1455508, 710.5, 718.26, 710.01
10/28/2015, 712.95, 2178841, 707.33, 712.98, 703.08
10/27/2015, 708.49, 2232183, 707.38, 713.62, 704.55
10/26/2015, 712.78, 2709292, 701.55, 719.15, 701.26
10/23/2015, 702, 6651909, 727.5, 730, 701.5
10/22/2015, 651.79, 3994360, 646.7, 657.8, 644.01
10/21/2015, 642.61, 1792869, 654.15, 655.87, 641.73
10/20/2015, 650.28, 2498077, 664.04, 664.7197, 644.195
10/19/2015, 666.1, 1465691, 661.18, 666.82, 659.58
10/16/2015, 662.2, 1610712, 664.11, 664.97, 657.2
10/15/2015, 661.74, 1832832, 654.66, 663.13, 654.46
10/14/2015, 651.16, 1413798, 653.21, 659.39, 648.85
10/13/2015, 652.3, 1806003, 643.15, 657.8125, 643.15
10/12/2015, 646.67, 1275565, 642.09, 648.5, 639.01
10/09/2015, 643.61, 1648656, 640, 645.99, 635.318
10/08/2015, 639.16, 2181990, 641.36, 644.45, 625.56
10/07/2015, 642.36, 2092536, 649.24, 650.609, 632.15
10/06/2015, 645.44, 2235078, 638.84, 649.25, 636.5295
10/05/2015, 641.47, 1802263, 632, 643.01, 627
10/02/2015, 626.91, 2681241, 607.2, 627.34, 603.13
10/01/2015, 611.29, 1866223, 608.37, 612.09, 599.85
09/30/2015, 608.42, 2412754, 603.28, 608.76, 600.73
09/29/2015, 594.97, 2310065, 597.28, 605, 590.22
09/28/2015, 594.89, 3118693, 610.34, 614.605, 589.38
09/25/2015, 611.97, 2173134, 629.77, 629.77, 611
09/24/2015, 625.8, 2238097, 616.64, 627.32, 612.4
09/23/2015, 622.36, 1470633, 622.05, 628.93, 620
09/22/2015, 622.69, 2561551, 627, 627.55, 615.43
09/21/2015, 635.44, 1786543, 634.4, 636.49, 625.94
09/18/2015, 629.25, 5123314, 636.79, 640, 627.02
09/17/2015, 642.9, 2259404, 637.79, 650.9, 635.02
09/16/2015, 635.98, 1276250, 635.47, 637.95, 632.32
09/15/2015, 635.14, 2082426, 626.7, 638.7, 623.78
09/14/2015, 623.24, 1701618, 625.7, 625.86, 619.43
09/11/2015, 625.77, 1372803, 619.75, 625.78, 617.42
09/10/2015, 621.35, 1903334, 613.1, 624.16, 611.43
09/09/2015, 612.72, 1699686, 621.22, 626.52, 609.6
09/08/2015, 614.66, 2277487, 612.49, 616.31, 604.12
09/04/2015, 600.7, 2087028, 600, 603.47, 595.25
09/03/2015, 606.25, 1757851, 617, 619.71, 602.8213
09/02/2015, 614.34, 2573982, 605.59, 614.34, 599.71
09/01/2015, 597.79, 3699844, 602.36, 612.86, 594.1
08/31/2015, 618.25, 2172168, 627.54, 635.8, 617.68
08/28/2015, 630.38, 1975818, 632.82, 636.88, 624.56
08/27/2015, 637.61, 3485906, 639.4, 643.59, 622
08/26/2015, 628.62, 4187276, 610.35, 631.71, 599.05
08/25/2015, 582.06, 3521916, 614.91, 617.45, 581.11
08/24/2015, 589.61, 5727282, 573, 614, 565.05
08/21/2015, 612.48, 4261666, 639.78, 640.05, 612.33
08/20/2015, 646.83, 2854028, 655.46, 662.99, 642.9
08/19/2015, 660.9, 2132265, 656.6, 667, 654.19
08/18/2015, 656.13, 1455664, 661.9, 664, 653.46
08/17/2015, 660.87, 1050553, 656.8, 661.38, 651.24
08/14/2015, 657.12, 1071333, 655.01, 659.855, 652.66
08/13/2015, 656.45, 1807182, 659.323, 664.5, 651.661
08/12/2015, 659.56, 2938651, 663.08, 665, 652.29
08/11/2015, 660.78, 5016425, 669.2, 674.9, 654.27
08/10/2015, 633.73, 1653836, 639.48, 643.44, 631.249
08/07/2015, 635.3, 1403441, 640.23, 642.68, 629.71
08/06/2015, 642.68, 1572150, 645, 645.379, 632.25
08/05/2015, 643.78, 2331720, 634.33, 647.86, 633.16
08/04/2015, 629.25, 1486858, 628.42, 634.81, 627.16
08/03/2015, 631.21, 1301439, 625.34, 633.0556, 625.34
07/31/2015, 625.61, 1705286, 631.38, 632.91, 625.5
07/30/2015, 632.59, 1472286, 630, 635.22, 622.05
07/29/2015, 631.93, 1573146, 628.8, 633.36, 622.65
07/28/2015, 628, 1713684, 632.83, 632.83, 623.31
07/27/2015, 627.26, 2673801, 621, 634.3, 620.5
07/24/2015, 623.56, 3622089, 647, 648.17, 622.52
07/23/2015, 644.28, 3014035, 661.27, 663.63, 641
07/22/2015, 662.1, 3707818, 660.89, 678.64, 659
07/21/2015, 662.3, 3363342, 655.21, 673, 654.3
07/20/2015, 663.02, 5857092, 659.24, 668.88, 653.01
07/17/2015, 672.93, 11153500, 649, 674.468, 645
07/16/2015, 579.85, 4559712, 565.12, 580.68, 565
07/15/2015, 560.22, 1782264, 560.13, 566.5029, 556.79
07/14/2015, 561.1, 3231284, 546.76, 565.8487, 546.71
07/13/2015, 546.55, 2204610, 532.88, 547.11, 532.4001
07/10/2015, 530.13, 1954951, 526.29, 532.56, 525.55
07/09/2015, 520.68, 1840155, 523.12, 523.77, 520.35
07/08/2015, 516.83, 1293372, 521.05, 522.734, 516.11
07/07/2015, 525.02, 1595672, 523.13, 526.18, 515.18
07/06/2015, 522.86, 1278587, 519.5, 525.25, 519
07/02/2015, 523.4, 1235773, 521.08, 524.65, 521.08
07/01/2015, 521.84, 1961197, 524.73, 525.69, 518.2305
06/30/2015, 520.51, 2234284, 526.02, 526.25, 520.5
06/29/2015, 521.52, 1935361, 525.01, 528.61, 520.54
06/26/2015, 531.69, 2108629, 537.26, 537.76, 531.35
06/25/2015, 535.23, 1332412, 538.87, 540.9, 535.23
06/24/2015, 537.84, 1286576, 540, 540, 535.66
06/23/2015, 540.48, 1196115, 539.64, 541.499, 535.25
06/22/2015, 538.19, 1243535, 539.59, 543.74, 537.53
06/19/2015, 536.69, 1890916, 537.21, 538.25, 533.01
06/18/2015, 536.73, 1832450, 531, 538.15, 530.79
06/17/2015, 529.26, 1269113, 529.37, 530.98, 525.1
06/16/2015, 528.15, 1071728, 528.4, 529.6399, 525.56
06/15/2015, 527.2, 1632675, 528, 528.3, 524
06/12/2015, 532.33, 955489, 531.6, 533.12, 530.16
06/11/2015, 534.61, 1208632, 538.425, 538.98, 533.02
06/10/2015, 536.69, 1813775, 529.36, 538.36, 529.35
06/09/2015, 526.69, 1454172, 527.56, 529.2, 523.01
06/08/2015, 526.83, 1523960, 533.31, 534.12, 526.24
06/05/2015, 533.33, 1375008, 536.35, 537.2, 532.52
06/04/2015, 536.7, 1346044, 537.76, 540.59, 534.32
06/03/2015, 540.31, 1716836, 539.91, 543.5, 537.11
06/02/2015, 539.18, 1936721, 532.93, 543, 531.33
06/01/2015, 533.99, 1900257, 536.79, 536.79, 529.76
05/29/2015, 532.11, 2590445, 537.37, 538.63, 531.45
05/28/2015, 539.78, 1029764, 538.01, 540.61, 536.25
05/27/2015, 539.79, 1524783, 532.8, 540.55, 531.71
05/26/2015, 532.32, 2404462, 538.12, 539, 529.88
05/22/2015, 540.11, 1175065, 540.15, 544.19, 539.51
05/21/2015, 542.51, 1461431, 537.95, 543.8399, 535.98
05/20/2015, 539.27, 1430565, 538.49, 542.92, 532.972
05/19/2015, 537.36, 1964037, 533.98, 540.66, 533.04
05/18/2015, 532.3, 2001117, 532.01, 534.82, 528.85
05/15/2015, 533.85, 1965088, 539.18, 539.2743, 530.38
05/14/2015, 538.4, 1401005, 533.77, 539, 532.41
05/13/2015, 529.62, 1253005, 530.56, 534.3215, 528.655
05/12/2015, 529.04, 1633180, 531.6, 533.2089, 525.26
05/11/2015, 535.7, 904465, 538.37, 541.98, 535.4
05/08/2015, 538.22, 1527181, 536.65, 541.15, 536
05/07/2015, 530.7, 1543986, 523.99, 533.46, 521.75
05/06/2015, 524.22, 1566865, 531.24, 532.38, 521.085
05/05/2015, 530.8, 1380519, 538.21, 539.74, 530.3906
05/04/2015, 540.78, 1303830, 538.53, 544.07, 535.06
05/01/2015, 537.9, 1758085, 538.43, 539.54, 532.1
04/30/2015, 537.34, 2080834, 547.87, 548.59, 535.05
04/29/2015, 549.08, 1696886, 550.47, 553.68, 546.905
04/28/2015, 553.68, 1490735, 554.64, 556.02, 550.366
04/27/2015, 555.37, 2390696, 563.39, 565.95, 553.2001
| 04/24/2020, 1279.31, 1640394, 1261.17, 1280.4, 1249.45
04/23/2020, 1276.31, 1566203, 1271.55, 1293.31, 1265.67
04/22/2020, 1263.21, 2093140, 1245.54, 1285.6133, 1242
04/21/2020, 1216.34, 2153003, 1247, 1254.27, 1209.71
04/20/2020, 1266.61, 1695488, 1271, 1281.6, 1261.37
04/17/2020, 1283.25, 1949042, 1284.85, 1294.43, 1271.23
04/16/2020, 1263.47, 2518099, 1274.1, 1279, 1242.62
04/15/2020, 1262.47, 1671703, 1245.61, 1280.46, 1240.4
04/14/2020, 1269.23, 2470353, 1245.09, 1282.07, 1236.93
04/13/2020, 1217.56, 1739828, 1209.18, 1220.51, 1187.5984
04/09/2020, 1211.45, 2175421, 1224.08, 1225.57, 1196.7351
04/08/2020, 1210.28, 1975135, 1206.5, 1219.07, 1188.16
04/07/2020, 1186.51, 2387329, 1221, 1225, 1182.23
04/06/2020, 1186.92, 2664723, 1138, 1194.66, 1130.94
04/03/2020, 1097.88, 2313400, 1119.015, 1123.54, 1079.81
04/02/2020, 1120.84, 1964881, 1098.26, 1126.86, 1096.4
04/01/2020, 1105.62, 2344173, 1122, 1129.69, 1097.45
03/31/2020, 1162.81, 2487983, 1147.3, 1175.31, 1138.14
03/30/2020, 1146.82, 2574061, 1125.04, 1151.63, 1096.48
03/27/2020, 1110.71, 3208495, 1125.67, 1150.6702, 1105.91
03/26/2020, 1161.75, 3573755, 1111.8, 1169.97, 1093.53
03/25/2020, 1102.49, 4081528, 1126.47, 1148.9, 1086.01
03/24/2020, 1134.46, 3344450, 1103.77, 1135, 1090.62
03/23/2020, 1056.62, 4044137, 1061.32, 1071.32, 1013.5361
03/20/2020, 1072.32, 3601750, 1135.72, 1143.99, 1065.49
03/19/2020, 1115.29, 3651106, 1093.05, 1157.9699, 1060.1075
03/18/2020, 1096.8, 4233435, 1056.51, 1106.5, 1037.28
03/17/2020, 1119.8, 3861489, 1093.11, 1130.86, 1056.01
03/16/2020, 1084.33, 4252365, 1096, 1152.2665, 1074.44
03/13/2020, 1219.73, 3700125, 1179, 1219.76, 1117.1432
03/12/2020, 1114.91, 4226748, 1126, 1193.87, 1113.3
03/11/2020, 1215.41, 2611229, 1249.7, 1260.96, 1196.07
03/10/2020, 1280.39, 2611373, 1260, 1281.15, 1218.77
03/09/2020, 1215.56, 3365365, 1205.3, 1254.7599, 1200
03/06/2020, 1298.41, 2660628, 1277.06, 1306.22, 1261.05
03/05/2020, 1319.04, 2561288, 1350.2, 1358.91, 1305.1
03/04/2020, 1386.52, 1913315, 1359.23, 1388.09, 1343.11
03/03/2020, 1341.39, 2402326, 1399.42, 1410.15, 1332
03/02/2020, 1389.11, 2431468, 1351.61, 1390.87, 1326.815
02/28/2020, 1339.33, 3790618, 1277.5, 1341.14, 1271
02/27/2020, 1318.09, 2978300, 1362.06, 1371.7037, 1317.17
02/26/2020, 1393.18, 2204037, 1396.14, 1415.7, 1379
02/25/2020, 1388.45, 2478278, 1433, 1438.14, 1382.4
02/24/2020, 1421.59, 2867053, 1426.11, 1436.97, 1411.39
02/21/2020, 1485.11, 1732273, 1508.03, 1512.215, 1480.44
02/20/2020, 1518.15, 1096552, 1522, 1529.64, 1506.82
02/19/2020, 1526.69, 949268, 1525.07, 1532.1063, 1521.4
02/18/2020, 1519.67, 1121140, 1515, 1531.63, 1512.59
02/14/2020, 1520.74, 1197836, 1515.6, 1520.74, 1507.34
02/13/2020, 1514.66, 929730, 1512.69, 1527.18, 1504.6
02/12/2020, 1518.27, 1167565, 1514.48, 1520.695, 1508.11
02/11/2020, 1508.79, 1344633, 1511.81, 1529.63, 1505.6378
02/10/2020, 1508.68, 1419876, 1474.32, 1509.5, 1474.32
02/07/2020, 1479.23, 1172270, 1467.3, 1485.84, 1466.35
02/06/2020, 1476.23, 1679384, 1450.33, 1481.9997, 1449.57
02/05/2020, 1448.23, 1986157, 1462.42, 1463.84, 1430.56
02/04/2020, 1447.07, 3932954, 1457.07, 1469.5, 1426.3
02/03/2020, 1485.94, 3055216, 1462, 1490, 1458.99
01/31/2020, 1434.23, 2417214, 1468.9, 1470.13, 1428.53
01/30/2020, 1455.84, 1339421, 1439.96, 1457.28, 1436.4
01/29/2020, 1458.63, 1078667, 1458.8, 1465.43, 1446.74
01/28/2020, 1452.56, 1577422, 1443, 1456, 1432.47
01/27/2020, 1433.9, 1755201, 1431, 1438.07, 1421.2
01/24/2020, 1466.71, 1784644, 1493.59, 1495.495, 1465.25
01/23/2020, 1486.65, 1351354, 1487.64, 1495.52, 1482.1
01/22/2020, 1485.95, 1610846, 1491, 1503.2143, 1484.93
01/21/2020, 1484.4, 2036780, 1479.12, 1491.85, 1471.2
01/17/2020, 1480.39, 2396215, 1462.91, 1481.2954, 1458.22
01/16/2020, 1451.7, 1173688, 1447.44, 1451.99, 1440.92
01/15/2020, 1439.2, 1282685, 1430.21, 1441.395, 1430.21
01/14/2020, 1430.88, 1560453, 1439.01, 1441.8, 1428.37
01/13/2020, 1439.23, 1653482, 1436.13, 1440.52, 1426.02
01/10/2020, 1429.73, 1821566, 1427.56, 1434.9292, 1418.35
01/09/2020, 1419.83, 1502664, 1420.57, 1427.33, 1410.27
01/08/2020, 1404.32, 1529177, 1392.08, 1411.58, 1390.84
01/07/2020, 1393.34, 1511693, 1397.94, 1402.99, 1390.38
01/06/2020, 1394.21, 1733149, 1350, 1396.5, 1350
01/03/2020, 1360.66, 1187006, 1347.86, 1372.5, 1345.5436
01/02/2020, 1367.37, 1406731, 1341.55, 1368.14, 1341.55
12/31/2019, 1337.02, 962468, 1330.11, 1338, 1329.085
12/30/2019, 1336.14, 1051323, 1350, 1353, 1334.02
12/27/2019, 1351.89, 1038718, 1362.99, 1364.53, 1349.31
12/26/2019, 1360.4, 667754, 1346.17, 1361.3269, 1344.47
12/24/2019, 1343.56, 347518, 1348.5, 1350.26, 1342.78
12/23/2019, 1348.84, 883200, 1355.87, 1359.7999, 1346.51
12/20/2019, 1349.59, 3316905, 1363.35, 1363.64, 1349
12/19/2019, 1356.04, 1470112, 1351.82, 1358.1, 1348.985
12/18/2019, 1352.62, 1657069, 1356.6, 1360.47, 1351
12/17/2019, 1355.12, 1855259, 1362.89, 1365, 1351.3231
12/16/2019, 1361.17, 1397451, 1356.5, 1364.68, 1352.67
12/13/2019, 1347.83, 1550028, 1347.95, 1353.0931, 1343.87
12/12/2019, 1350.27, 1281722, 1345.94, 1355.775, 1340.5
12/11/2019, 1345.02, 850796, 1350.84, 1351.2, 1342.67
12/10/2019, 1344.66, 1094653, 1341.5, 1349.975, 1336.04
12/09/2019, 1343.56, 1355795, 1338.04, 1359.45, 1337.84
12/06/2019, 1340.62, 1315510, 1333.44, 1344, 1333.44
12/05/2019, 1328.13, 1212818, 1328, 1329.3579, 1316.44
12/04/2019, 1320.54, 1538110, 1307.01, 1325.8, 1304.87
12/03/2019, 1295.28, 1268647, 1279.57, 1298.461, 1279
12/02/2019, 1289.92, 1511851, 1301, 1305.83, 1281
11/29/2019, 1304.96, 586981, 1307.12, 1310.205, 1303.97
11/27/2019, 1312.99, 996329, 1315, 1318.36, 1309.63
11/26/2019, 1313.55, 1069795, 1309.86, 1314.8, 1305.09
11/25/2019, 1306.69, 1036487, 1299.18, 1311.31, 1298.13
11/22/2019, 1295.34, 1386506, 1305.62, 1308.73, 1291.41
11/21/2019, 1301.35, 995499, 1301.48, 1312.59, 1293
11/20/2019, 1303.05, 1309835, 1311.74, 1315, 1291.15
11/19/2019, 1315.46, 1269372, 1327.7, 1327.7, 1312.8
11/18/2019, 1320.7, 1488083, 1332.22, 1335.5288, 1317.5
11/15/2019, 1334.87, 1782955, 1318.94, 1334.88, 1314.2796
11/14/2019, 1311.46, 1194305, 1297.5, 1317, 1295.65
11/13/2019, 1298, 853861, 1294.07, 1304.3, 1293.51
11/12/2019, 1298.8, 1085859, 1300, 1310, 1295.77
11/11/2019, 1299.19, 1012429, 1303.18, 1306.425, 1297.41
11/08/2019, 1311.37, 1251916, 1305.28, 1318, 1304.365
11/07/2019, 1308.86, 2029970, 1294.28, 1323.74, 1294.245
11/06/2019, 1291.8, 1152977, 1289.46, 1293.73, 1282.5
11/05/2019, 1292.03, 1282711, 1292.89, 1298.93, 1291.2289
11/04/2019, 1291.37, 1500964, 1276.45, 1294.13, 1276.355
11/01/2019, 1273.74, 1670072, 1265, 1274.62, 1260.5
10/31/2019, 1260.11, 1455651, 1261.28, 1267.67, 1250.8428
10/30/2019, 1261.29, 1408851, 1252.97, 1269.36, 1252
10/29/2019, 1262.62, 1886380, 1276.23, 1281.59, 1257.2119
10/28/2019, 1290, 2613237, 1275.45, 1299.31, 1272.54
10/25/2019, 1265.13, 1213051, 1251.03, 1269.6, 1250.01
10/24/2019, 1260.99, 1039868, 1260.9, 1264, 1253.715
10/23/2019, 1259.13, 928595, 1242.36, 1259.89, 1242.36
10/22/2019, 1242.8, 1047851, 1247.85, 1250.6, 1241.38
10/21/2019, 1246.15, 1038042, 1252.26, 1254.6287, 1240.6
10/18/2019, 1245.49, 1352839, 1253.46, 1258.89, 1241.08
10/17/2019, 1253.07, 980510, 1250.93, 1263.325, 1249.94
10/16/2019, 1243.64, 1168174, 1241.17, 1254.74, 1238.45
10/15/2019, 1243.01, 1395259, 1220.4, 1247.33, 1220.4
10/14/2019, 1217.14, 882039, 1212.34, 1226.33, 1211.76
10/11/2019, 1215.45, 1277144, 1222.21, 1228.39, 1213.74
10/10/2019, 1208.67, 932531, 1198.58, 1215, 1197.34
10/09/2019, 1202.31, 876632, 1199.35, 1208.35, 1197.63
10/08/2019, 1189.13, 1141784, 1197.59, 1206.08, 1189.01
10/07/2019, 1207.68, 867149, 1204.4, 1218.2036, 1203.75
10/04/2019, 1209, 1183264, 1191.89, 1211.44, 1189.17
10/03/2019, 1187.83, 1663656, 1180, 1189.06, 1162.43
10/02/2019, 1176.63, 1639237, 1196.98, 1196.99, 1171.29
10/01/2019, 1205.1, 1358279, 1219, 1231.23, 1203.58
09/30/2019, 1219, 1419676, 1220.97, 1226, 1212.3
09/27/2019, 1225.09, 1354432, 1243.01, 1244.02, 1214.45
09/26/2019, 1241.39, 1561882, 1241.96, 1245, 1232.268
09/25/2019, 1246.52, 1593875, 1215.82, 1248.3, 1210.09
09/24/2019, 1218.76, 1591786, 1240, 1246.74, 1210.68
09/23/2019, 1234.03, 1075253, 1226, 1239.09, 1224.17
09/20/2019, 1229.93, 2337269, 1233.12, 1243.32, 1223.08
09/19/2019, 1238.71, 1000155, 1232.06, 1244.44, 1232.02
09/18/2019, 1232.41, 1144333, 1227.51, 1235.61, 1216.53
09/17/2019, 1229.15, 958112, 1230.4, 1235, 1223.69
09/16/2019, 1231.3, 1053299, 1229.52, 1239.56, 1225.61
09/13/2019, 1239.56, 1301350, 1231.35, 1240.88, 1227.01
09/12/2019, 1234.25, 1725908, 1224.3, 1241.86, 1223.02
09/11/2019, 1220.17, 1307033, 1203.41, 1222.6, 1202.2
09/10/2019, 1206, 1260115, 1195.15, 1210, 1194.58
09/09/2019, 1204.41, 1471880, 1204, 1220, 1192.62
09/06/2019, 1204.93, 1072143, 1208.13, 1212.015, 1202.5222
09/05/2019, 1211.38, 1408601, 1191.53, 1213.04, 1191.53
09/04/2019, 1181.41, 1068968, 1176.71, 1183.48, 1171
09/03/2019, 1168.39, 1480420, 1177.03, 1186.89, 1163.2
08/30/2019, 1188.1, 1129959, 1198.5, 1198.5, 1183.8026
08/29/2019, 1192.85, 1088858, 1181.12, 1196.06, 1181.12
08/28/2019, 1171.02, 802243, 1161.71, 1176.4199, 1157.3
08/27/2019, 1167.84, 1077452, 1180.53, 1182.4, 1161.45
08/26/2019, 1168.89, 1226441, 1157.26, 1169.47, 1152.96
08/23/2019, 1151.29, 1688271, 1181.99, 1194.08, 1147.75
08/22/2019, 1189.53, 947906, 1194.07, 1198.0115, 1178.58
08/21/2019, 1191.25, 741053, 1193.15, 1199, 1187.43
08/20/2019, 1182.69, 915605, 1195.25, 1196.06, 1182.11
08/19/2019, 1198.45, 1232517, 1190.09, 1206.99, 1190.09
08/16/2019, 1177.6, 1349436, 1179.55, 1182.72, 1171.81
08/15/2019, 1167.26, 1224739, 1163.5, 1175.84, 1162.11
08/14/2019, 1164.29, 1578668, 1176.31, 1182.3, 1160.54
08/13/2019, 1197.27, 1318009, 1171.46, 1204.78, 1171.46
08/12/2019, 1174.71, 1003187, 1179.21, 1184.96, 1167.6723
08/09/2019, 1188.01, 1065658, 1197.99, 1203.88, 1183.603
08/08/2019, 1204.8, 1467997, 1182.83, 1205.01, 1173.02
08/07/2019, 1173.99, 1444324, 1156, 1178.4451, 1149.6239
08/06/2019, 1169.95, 1709374, 1163.31, 1179.96, 1160
08/05/2019, 1152.32, 2597455, 1170.04, 1175.24, 1140.14
08/02/2019, 1193.99, 1645067, 1200.74, 1206.9, 1188.94
08/01/2019, 1209.01, 1698510, 1214.03, 1234.11, 1205.72
07/31/2019, 1216.68, 1725454, 1223, 1234, 1207.7635
07/30/2019, 1225.14, 1453263, 1225.41, 1234.87, 1223.3
07/29/2019, 1239.41, 2223731, 1241.05, 1247.37, 1228.23
07/26/2019, 1250.41, 4805752, 1224.04, 1265.5499, 1224
07/25/2019, 1132.12, 2209823, 1137.82, 1141.7, 1120.92
07/24/2019, 1137.81, 1590101, 1131.9, 1144, 1126.99
07/23/2019, 1146.21, 1093688, 1144, 1146.9, 1131.8
07/22/2019, 1138.07, 1301846, 1133.45, 1139.25, 1124.24
07/19/2019, 1130.1, 1647245, 1148.19, 1151.14, 1129.62
07/18/2019, 1146.33, 1291281, 1141.74, 1147.605, 1132.73
07/17/2019, 1146.35, 1170047, 1150.97, 1158.36, 1145.77
07/16/2019, 1153.58, 1238807, 1146, 1158.58, 1145
07/15/2019, 1150.34, 903780, 1146.86, 1150.82, 1139.4
07/12/2019, 1144.9, 863973, 1143.99, 1147.34, 1138.78
07/11/2019, 1144.21, 1195569, 1143.25, 1153.07, 1139.58
07/10/2019, 1140.48, 1209466, 1131.22, 1142.05, 1130.97
07/09/2019, 1124.83, 1330370, 1111.8, 1128.025, 1107.17
07/08/2019, 1116.35, 1236419, 1125.17, 1125.98, 1111.21
07/05/2019, 1131.59, 1264540, 1117.8, 1132.88, 1116.14
07/03/2019, 1121.58, 767011, 1117.41, 1126.76, 1113.86
07/02/2019, 1111.25, 991755, 1102.24, 1111.77, 1098.17
07/01/2019, 1097.95, 1438504, 1098, 1107.58, 1093.703
06/28/2019, 1080.91, 1693450, 1076.39, 1081, 1073.37
06/27/2019, 1076.01, 1004477, 1084, 1087.1, 1075.29
06/26/2019, 1079.8, 1810869, 1086.5, 1092.97, 1072.24
06/25/2019, 1086.35, 1546913, 1112.66, 1114.35, 1083.8
06/24/2019, 1115.52, 1395696, 1119.61, 1122, 1111.01
06/21/2019, 1121.88, 1947591, 1109.24, 1124.11, 1108.08
06/20/2019, 1111.42, 1262011, 1119.99, 1120.12, 1104.74
06/19/2019, 1102.33, 1339218, 1105.6, 1107, 1093.48
06/18/2019, 1103.6, 1386684, 1109.69, 1116.39, 1098.99
06/17/2019, 1092.5, 941602, 1086.28, 1099.18, 1086.28
06/14/2019, 1085.35, 1111643, 1086.42, 1092.69, 1080.1721
06/13/2019, 1088.77, 1058000, 1083.64, 1094.17, 1080.15
06/12/2019, 1077.03, 1061255, 1078, 1080.93, 1067.54
06/11/2019, 1078.72, 1437063, 1093.98, 1101.99, 1077.6025
06/10/2019, 1080.38, 1464248, 1072.98, 1092.66, 1072.3216
06/07/2019, 1066.04, 1802370, 1050.63, 1070.92, 1048.4
06/06/2019, 1044.34, 1703244, 1044.99, 1047.49, 1033.7
06/05/2019, 1042.22, 2168439, 1051.54, 1053.55, 1030.49
06/04/2019, 1053.05, 2833483, 1042.9, 1056.05, 1033.69
06/03/2019, 1036.23, 5130576, 1065.5, 1065.5, 1025
05/31/2019, 1103.63, 1508203, 1101.29, 1109.6, 1100.18
05/30/2019, 1117.95, 951873, 1115.54, 1123.13, 1112.12
05/29/2019, 1116.46, 1538212, 1127.52, 1129.1, 1108.2201
05/28/2019, 1134.15, 1365166, 1134, 1151.5871, 1133.12
05/24/2019, 1133.47, 1112341, 1147.36, 1149.765, 1131.66
05/23/2019, 1140.77, 1199300, 1140.5, 1145.9725, 1129.224
05/22/2019, 1151.42, 914839, 1146.75, 1158.52, 1145.89
05/21/2019, 1149.63, 1160158, 1148.49, 1152.7077, 1137.94
05/20/2019, 1138.85, 1353292, 1144.5, 1146.7967, 1131.4425
05/17/2019, 1162.3, 1208623, 1168.47, 1180.15, 1160.01
05/16/2019, 1178.98, 1531404, 1164.51, 1188.16, 1162.84
05/15/2019, 1164.21, 2289302, 1117.87, 1171.33, 1116.6657
05/14/2019, 1120.44, 1836604, 1137.21, 1140.42, 1119.55
05/13/2019, 1132.03, 1860648, 1141.96, 1147.94, 1122.11
05/10/2019, 1164.27, 1314546, 1163.59, 1172.6, 1142.5
05/09/2019, 1162.38, 1185973, 1159.03, 1169.66, 1150.85
05/08/2019, 1166.27, 1309514, 1172.01, 1180.4243, 1165.74
05/07/2019, 1174.1, 1551368, 1180.47, 1190.44, 1161.04
05/06/2019, 1189.39, 1563943, 1166.26, 1190.85, 1166.26
05/03/2019, 1185.4, 1980653, 1173.65, 1186.8, 1169
05/02/2019, 1162.61, 1944817, 1167.76, 1174.1895, 1155.0018
05/01/2019, 1168.08, 2642983, 1188.05, 1188.05, 1167.18
04/30/2019, 1188.48, 6194691, 1185, 1192.81, 1175
04/29/2019, 1287.58, 2412788, 1274, 1289.27, 1266.2949
04/26/2019, 1272.18, 1228276, 1269, 1273.07, 1260.32
04/25/2019, 1263.45, 1099614, 1264.77, 1267.4083, 1252.03
04/24/2019, 1256, 1015006, 1264.12, 1268.01, 1255
04/23/2019, 1264.55, 1271195, 1250.69, 1269, 1246.38
04/22/2019, 1248.84, 806577, 1235.99, 1249.09, 1228.31
04/18/2019, 1236.37, 1315676, 1239.18, 1242, 1234.61
04/17/2019, 1236.34, 1211866, 1233, 1240.56, 1227.82
04/16/2019, 1227.13, 855258, 1225, 1230.82, 1220.12
04/15/2019, 1221.1, 1187353, 1218, 1224.2, 1209.1101
04/12/2019, 1217.87, 926799, 1210, 1218.35, 1208.11
04/11/2019, 1204.62, 709417, 1203.96, 1207.96, 1200.13
04/10/2019, 1202.16, 724524, 1200.68, 1203.785, 1196.435
04/09/2019, 1197.25, 865416, 1196, 1202.29, 1193.08
04/08/2019, 1203.84, 859969, 1207.89, 1208.69, 1199.86
04/05/2019, 1207.15, 900950, 1214.99, 1216.22, 1205.03
04/04/2019, 1215, 949962, 1205.94, 1215.67, 1204.13
04/03/2019, 1205.92, 1014195, 1207.48, 1216.3, 1200.5
04/02/2019, 1200.49, 800820, 1195.32, 1201.35, 1185.71
04/01/2019, 1194.43, 1188235, 1184.1, 1196.66, 1182
03/29/2019, 1173.31, 1269573, 1174.9, 1178.99, 1162.88
03/28/2019, 1168.49, 966843, 1171.54, 1171.565, 1159.4312
03/27/2019, 1173.02, 1362217, 1185.5, 1187.559, 1159.37
03/26/2019, 1184.62, 1894639, 1198.53, 1202.83, 1176.72
03/25/2019, 1193, 1493841, 1196.93, 1206.3975, 1187.04
03/22/2019, 1205.5, 1668910, 1226.32, 1230, 1202.825
03/21/2019, 1231.54, 1195899, 1216, 1231.79, 1213.15
03/20/2019, 1223.97, 2089367, 1197.35, 1227.14, 1196.17
03/19/2019, 1198.85, 1404863, 1188.81, 1200, 1185.87
03/18/2019, 1184.26, 1212506, 1183.3, 1190, 1177.4211
03/15/2019, 1184.46, 2457597, 1193.38, 1196.57, 1182.61
03/14/2019, 1185.55, 1150950, 1194.51, 1197.88, 1184.48
03/13/2019, 1193.32, 1434816, 1200.645, 1200.93, 1191.94
03/12/2019, 1193.2, 2012306, 1178.26, 1200, 1178.26
03/11/2019, 1175.76, 1569332, 1144.45, 1176.19, 1144.45
03/08/2019, 1142.32, 1212271, 1126.73, 1147.08, 1123.3
03/07/2019, 1143.3, 1166076, 1155.72, 1156.755, 1134.91
03/06/2019, 1157.86, 1094100, 1162.49, 1167.5658, 1155.49
03/05/2019, 1162.03, 1422357, 1150.06, 1169.61, 1146.195
03/04/2019, 1147.8, 1444774, 1146.99, 1158.2804, 1130.69
03/01/2019, 1140.99, 1447454, 1124.9, 1142.97, 1124.75
02/28/2019, 1119.92, 1541068, 1111.3, 1127.65, 1111.01
02/27/2019, 1116.05, 968362, 1106.95, 1117.98, 1101
02/26/2019, 1115.13, 1469761, 1105.75, 1119.51, 1099.92
02/25/2019, 1109.4, 1395281, 1116, 1118.54, 1107.27
02/22/2019, 1110.37, 1048361, 1100.9, 1111.24, 1095.6
02/21/2019, 1096.97, 1414744, 1110.84, 1111.94, 1092.52
02/20/2019, 1113.8, 1080144, 1119.99, 1123.41, 1105.28
02/19/2019, 1118.56, 1046315, 1110, 1121.89, 1110
02/15/2019, 1113.65, 1442461, 1130.08, 1131.67, 1110.65
02/14/2019, 1121.67, 941678, 1118.05, 1128.23, 1110.445
02/13/2019, 1120.16, 1048630, 1124.99, 1134.73, 1118.5
02/12/2019, 1121.37, 1608658, 1106.8, 1125.295, 1105.85
02/11/2019, 1095.01, 1063825, 1096.95, 1105.945, 1092.86
02/08/2019, 1095.06, 1072031, 1087, 1098.91, 1086.55
02/07/2019, 1098.71, 2040615, 1104.16, 1104.84, 1086
02/06/2019, 1115.23, 2101674, 1139.57, 1147, 1112.77
02/05/2019, 1145.99, 3529974, 1124.84, 1146.85, 1117.248
02/04/2019, 1132.8, 2518184, 1112.66, 1132.8, 1109.02
02/01/2019, 1110.75, 1455609, 1112.4, 1125, 1104.89
01/31/2019, 1116.37, 1531463, 1103, 1117.33, 1095.41
01/30/2019, 1089.06, 1241760, 1068.43, 1091, 1066.85
01/29/2019, 1060.62, 1006731, 1072.68, 1075.15, 1055.8647
01/28/2019, 1070.08, 1277745, 1080.11, 1083, 1063.8
01/25/2019, 1090.99, 1114785, 1085, 1094, 1081.82
01/24/2019, 1073.9, 1317718, 1076.48, 1079.475, 1060.7
01/23/2019, 1075.57, 956526, 1077.35, 1084.93, 1059.75
01/22/2019, 1070.52, 1607398, 1088, 1091.51, 1063.47
01/18/2019, 1098.26, 1933754, 1100, 1108.352, 1090.9
01/17/2019, 1089.9, 1223674, 1079.47, 1091.8, 1073.5
01/16/2019, 1080.97, 1320530, 1080, 1092.375, 1079.34
01/15/2019, 1077.15, 1452238, 1050.17, 1080.05, 1047.34
01/14/2019, 1044.69, 1127417, 1046.92, 1051.53, 1041.255
01/11/2019, 1057.19, 1512651, 1063.18, 1063.775, 1048.48
01/10/2019, 1070.33, 1444976, 1067.66, 1071.15, 1057.71
01/09/2019, 1074.66, 1198369, 1081.65, 1082.63, 1066.4
01/08/2019, 1076.28, 1748371, 1076.11, 1084.56, 1060.53
01/07/2019, 1068.39, 1978077, 1071.5, 1073.9999, 1054.76
01/04/2019, 1070.71, 2080144, 1032.59, 1070.84, 1027.4179
01/03/2019, 1016.06, 1829379, 1041, 1056.98, 1014.07
01/02/2019, 1045.85, 1516681, 1016.57, 1052.32, 1015.71
12/31/2018, 1035.61, 1492541, 1050.96, 1052.7, 1023.59
12/28/2018, 1037.08, 1399218, 1049.62, 1055.56, 1033.1
12/27/2018, 1043.88, 2102069, 1017.15, 1043.89, 997
12/26/2018, 1039.46, 2337212, 989.01, 1040, 983
12/24/2018, 976.22, 1590328, 973.9, 1003.54, 970.11
12/21/2018, 979.54, 4560424, 1015.3, 1024.02, 973.69
12/20/2018, 1009.41, 2659047, 1018.13, 1034.22, 996.36
12/19/2018, 1023.01, 2419322, 1033.99, 1062, 1008.05
12/18/2018, 1028.71, 2101854, 1026.09, 1049.48, 1021.44
12/17/2018, 1016.53, 2337631, 1037.51, 1053.15, 1007.9
12/14/2018, 1042.1, 1685802, 1049.98, 1062.6, 1040.79
12/13/2018, 1061.9, 1329198, 1068.07, 1079.7597, 1053.93
12/12/2018, 1063.68, 1523276, 1068, 1081.65, 1062.79
12/11/2018, 1051.75, 1354751, 1056.49, 1060.6, 1039.84
12/10/2018, 1039.55, 1793465, 1035.05, 1048.45, 1023.29
12/07/2018, 1036.58, 2098526, 1060.01, 1075.26, 1028.5
12/06/2018, 1068.73, 2758098, 1034.26, 1071.2, 1030.7701
12/04/2018, 1050.82, 2278200, 1103.12, 1104.42, 1049.98
12/03/2018, 1106.43, 1900355, 1123.14, 1124.65, 1103.6645
11/30/2018, 1094.43, 2554416, 1089.07, 1095.57, 1077.88
11/29/2018, 1088.3, 1403540, 1076.08, 1094.245, 1076
11/28/2018, 1086.23, 2399374, 1048.76, 1086.84, 1035.76
11/27/2018, 1044.41, 1801334, 1041, 1057.58, 1038.49
11/26/2018, 1048.62, 1846430, 1038.35, 1049.31, 1033.91
11/23/2018, 1023.88, 691462, 1030, 1037.59, 1022.3992
11/21/2018, 1037.61, 1531676, 1036.76, 1048.56, 1033.47
11/20/2018, 1025.76, 2447254, 1000, 1031.74, 996.02
11/19/2018, 1020, 1837207, 1057.2, 1060.79, 1016.2601
11/16/2018, 1061.49, 1641232, 1059.41, 1067, 1048.98
11/15/2018, 1064.71, 1819132, 1044.71, 1071.85, 1031.78
11/14/2018, 1043.66, 1561656, 1050, 1054.5643, 1031
11/13/2018, 1036.05, 1496534, 1043.29, 1056.605, 1031.15
11/12/2018, 1038.63, 1429319, 1061.39, 1062.12, 1031
11/09/2018, 1066.15, 1343154, 1073.99, 1075.56, 1053.11
11/08/2018, 1082.4, 1463022, 1091.38, 1093.27, 1072.2048
11/07/2018, 1093.39, 2057155, 1069, 1095.46, 1065.9
11/06/2018, 1055.81, 1225197, 1039.48, 1064.345, 1038.07
11/05/2018, 1040.09, 2436742, 1055, 1058.47, 1021.24
11/02/2018, 1057.79, 1829295, 1073.73, 1082.975, 1054.61
11/01/2018, 1070, 1456222, 1075.8, 1083.975, 1062.46
10/31/2018, 1076.77, 2528584, 1059.81, 1091.94, 1057
10/30/2018, 1036.21, 3209126, 1008.46, 1037.49, 1000.75
10/29/2018, 1020.08, 3873644, 1082.47, 1097.04, 995.83
10/26/2018, 1071.47, 4185201, 1037.03, 1106.53, 1034.09
10/25/2018, 1095.57, 2511884, 1071.79, 1110.98, 1069.55
10/24/2018, 1050.71, 1910060, 1104.25, 1106.12, 1048.74
10/23/2018, 1103.69, 1847798, 1080.89, 1107.89, 1070
10/22/2018, 1101.16, 1494285, 1103.06, 1112.23, 1091
10/19/2018, 1096.46, 1264605, 1093.37, 1110.36, 1087.75
10/18/2018, 1087.97, 2056606, 1121.84, 1121.84, 1077.09
10/17/2018, 1115.69, 1397613, 1126.46, 1128.99, 1102.19
10/16/2018, 1121.28, 1845491, 1104.59, 1124.22, 1102.5
10/15/2018, 1092.25, 1343231, 1108.91, 1113.4464, 1089
10/12/2018, 1110.08, 2029872, 1108, 1115, 1086.402
10/11/2018, 1079.32, 2939514, 1072.94, 1106.4, 1068.27
10/10/2018, 1081.22, 2574985, 1131.08, 1132.17, 1081.13
10/09/2018, 1138.82, 1308706, 1146.15, 1154.35, 1137.572
10/08/2018, 1148.97, 1877142, 1150.11, 1168, 1127.3636
10/05/2018, 1157.35, 1184245, 1167.5, 1173.4999, 1145.12
10/04/2018, 1168.19, 2151762, 1195.33, 1197.51, 1155.576
10/03/2018, 1202.95, 1207280, 1205, 1206.41, 1193.83
10/02/2018, 1200.11, 1655602, 1190.96, 1209.96, 1186.63
10/01/2018, 1195.31, 1345250, 1199.89, 1209.9, 1190.3
09/28/2018, 1193.47, 1306822, 1191.87, 1195.41, 1184.5
09/27/2018, 1194.64, 1244278, 1186.73, 1202.1, 1183.63
09/26/2018, 1180.49, 1346434, 1185.15, 1194.23, 1174.765
09/25/2018, 1184.65, 937577, 1176.15, 1186.88, 1168
09/24/2018, 1173.37, 1218532, 1157.17, 1178, 1146.91
09/21/2018, 1166.09, 4363929, 1192, 1192.21, 1166.04
09/20/2018, 1186.87, 1209855, 1179.99, 1189.89, 1173.36
09/19/2018, 1171.09, 1185321, 1164.98, 1173.21, 1154.58
09/18/2018, 1161.22, 1184407, 1157.09, 1176.08, 1157.09
09/17/2018, 1156.05, 1279147, 1170.14, 1177.24, 1154.03
09/14/2018, 1172.53, 934300, 1179.1, 1180.425, 1168.3295
09/13/2018, 1175.33, 1402005, 1170.74, 1178.61, 1162.85
09/12/2018, 1162.82, 1291304, 1172.72, 1178.61, 1158.36
09/11/2018, 1177.36, 1209171, 1161.63, 1178.68, 1156.24
09/10/2018, 1164.64, 1115259, 1172.19, 1174.54, 1160.11
09/07/2018, 1164.83, 1401034, 1158.67, 1175.26, 1157.215
09/06/2018, 1171.44, 1886690, 1186.3, 1186.3, 1152
09/05/2018, 1186.48, 2043732, 1193.8, 1199.0096, 1162
09/04/2018, 1197, 1800509, 1204.27, 1212.99, 1192.5
08/31/2018, 1218.19, 1812366, 1234.98, 1238.66, 1211.2854
08/30/2018, 1239.12, 1320261, 1244.23, 1253.635, 1232.59
08/29/2018, 1249.3, 1295939, 1237.45, 1250.66, 1236.3588
08/28/2018, 1231.15, 1296532, 1241.29, 1242.545, 1228.69
08/27/2018, 1241.82, 1154962, 1227.6, 1243.09, 1225.716
08/24/2018, 1220.65, 946529, 1208.82, 1221.65, 1206.3588
08/23/2018, 1205.38, 988509, 1207.14, 1221.28, 1204.24
08/22/2018, 1207.33, 881463, 1200, 1211.84, 1199
08/21/2018, 1201.62, 1187884, 1208, 1217.26, 1200.3537
08/20/2018, 1207.77, 864462, 1205.02, 1211, 1194.6264
08/17/2018, 1200.96, 1381724, 1202.03, 1209.02, 1188.24
08/16/2018, 1206.49, 1319985, 1224.73, 1225.9999, 1202.55
08/15/2018, 1214.38, 1815642, 1229.26, 1235.24, 1209.51
08/14/2018, 1242.1, 1342534, 1235.19, 1245.8695, 1225.11
08/13/2018, 1235.01, 957153, 1236.98, 1249.2728, 1233.6405
08/10/2018, 1237.61, 1107323, 1243, 1245.695, 1232
08/09/2018, 1249.1, 805227, 1249.9, 1255.542, 1246.01
08/08/2018, 1245.61, 1369650, 1240.47, 1256.5, 1238.0083
08/07/2018, 1242.22, 1493073, 1237, 1251.17, 1236.17
08/06/2018, 1224.77, 1080923, 1225, 1226.0876, 1215.7965
08/03/2018, 1223.71, 1072524, 1229.62, 1230, 1215.06
08/02/2018, 1226.15, 1520488, 1205.9, 1229.88, 1204.79
08/01/2018, 1220.01, 1567142, 1228, 1233.47, 1210.21
07/31/2018, 1217.26, 1632823, 1220.01, 1227.5877, 1205.6
07/30/2018, 1219.74, 1822782, 1228.01, 1234.916, 1211.47
07/27/2018, 1238.5, 2115802, 1271, 1273.89, 1231
07/26/2018, 1268.33, 2334881, 1251, 1269.7707, 1249.02
07/25/2018, 1263.7, 2115890, 1239.13, 1265.86, 1239.13
07/24/2018, 1248.08, 3303268, 1262.59, 1266, 1235.56
07/23/2018, 1205.5, 2584034, 1181.01, 1206.49, 1181
07/20/2018, 1184.91, 1246898, 1186.96, 1196.86, 1184.22
07/19/2018, 1186.96, 1256113, 1191, 1200, 1183.32
07/18/2018, 1195.88, 1391232, 1196.56, 1204.5, 1190.34
07/17/2018, 1198.8, 1585091, 1172.22, 1203.04, 1170.6
07/16/2018, 1183.86, 1049560, 1189.39, 1191, 1179.28
07/13/2018, 1188.82, 1221687, 1185, 1195.4173, 1180
07/12/2018, 1183.48, 1251083, 1159.89, 1184.41, 1155.935
07/11/2018, 1153.9, 1094301, 1144.59, 1164.29, 1141.0003
07/10/2018, 1152.84, 789249, 1156.98, 1159.59, 1149.59
07/09/2018, 1154.05, 906073, 1148.48, 1154.67, 1143.42
07/06/2018, 1140.17, 966155, 1123.58, 1140.93, 1120.7371
07/05/2018, 1124.27, 1060752, 1110.53, 1127.5, 1108.48
07/03/2018, 1102.89, 679034, 1135.82, 1135.82, 1100.02
07/02/2018, 1127.46, 1188616, 1099, 1128, 1093.8
06/29/2018, 1115.65, 1275979, 1120, 1128.2265, 1115
06/28/2018, 1114.22, 1072438, 1102.09, 1122.31, 1096.01
06/27/2018, 1103.98, 1287698, 1121.34, 1131.8362, 1103.62
06/26/2018, 1118.46, 1559791, 1128, 1133.21, 1116.6589
06/25/2018, 1124.81, 2155276, 1143.6, 1143.91, 1112.78
06/22/2018, 1155.48, 1310164, 1159.14, 1162.4965, 1147.26
06/21/2018, 1157.66, 1232352, 1174.85, 1177.295, 1152.232
06/20/2018, 1169.84, 1648248, 1175.31, 1186.2856, 1169.16
06/19/2018, 1168.06, 1616125, 1158.5, 1171.27, 1154.01
06/18/2018, 1173.46, 1400641, 1143.65, 1174.31, 1143.59
06/15/2018, 1152.26, 2119134, 1148.86, 1153.42, 1143.485
06/14/2018, 1152.12, 1350085, 1143.85, 1155.47, 1140.64
06/13/2018, 1134.79, 1490017, 1141.12, 1146.5, 1133.38
06/12/2018, 1139.32, 899231, 1131.07, 1139.79, 1130.735
06/11/2018, 1129.99, 1071114, 1118.6, 1137.26, 1118.6
06/08/2018, 1120.87, 1289859, 1118.18, 1126.67, 1112.15
06/07/2018, 1123.86, 1519860, 1131.32, 1135.82, 1116.52
06/06/2018, 1136.88, 1697489, 1142.17, 1143, 1125.7429
06/05/2018, 1139.66, 1538169, 1140.99, 1145.738, 1133.19
06/04/2018, 1139.29, 1881046, 1122.33, 1141.89, 1122.005
06/01/2018, 1119.5, 2416755, 1099.35, 1120, 1098.5
05/31/2018, 1084.99, 3085325, 1067.56, 1097.19, 1067.56
05/30/2018, 1067.8, 1129958, 1063.03, 1069.21, 1056.83
05/29/2018, 1060.32, 1858676, 1064.89, 1073.37, 1055.22
05/25/2018, 1075.66, 878903, 1079.02, 1082.56, 1073.775
05/24/2018, 1079.24, 757752, 1079, 1080.47, 1066.15
05/23/2018, 1079.69, 1057712, 1065.13, 1080.78, 1061.71
05/22/2018, 1069.73, 1088700, 1083.56, 1086.59, 1066.69
05/21/2018, 1079.58, 1012258, 1074.06, 1088, 1073.65
05/18/2018, 1066.36, 1496448, 1061.86, 1069.94, 1060.68
05/17/2018, 1078.59, 1031190, 1079.89, 1086.87, 1073.5
05/16/2018, 1081.77, 989819, 1077.31, 1089.27, 1076.26
05/15/2018, 1079.23, 1494306, 1090, 1090.05, 1073.47
05/14/2018, 1100.2, 1450140, 1100, 1110.75, 1099.11
05/11/2018, 1098.26, 1253205, 1093.6, 1101.3295, 1090.91
05/10/2018, 1097.57, 1441456, 1086.03, 1100.44, 1085.64
05/09/2018, 1082.76, 2032319, 1058.1, 1085.44, 1056.365
05/08/2018, 1053.91, 1217260, 1058.54, 1060.55, 1047.145
05/07/2018, 1054.79, 1464008, 1049.23, 1061.68, 1047.1
05/04/2018, 1048.21, 1936797, 1016.9, 1048.51, 1016.9
05/03/2018, 1023.72, 1813623, 1019, 1029.675, 1006.29
05/02/2018, 1024.38, 1534094, 1028.1, 1040.389, 1022.87
05/01/2018, 1037.31, 1427171, 1013.66, 1038.47, 1008.21
04/30/2018, 1017.33, 1664084, 1030.01, 1037, 1016.85
04/27/2018, 1030.05, 1617452, 1046, 1049.5, 1025.59
04/26/2018, 1040.04, 1984448, 1029.51, 1047.98, 1018.19
04/25/2018, 1021.18, 2225495, 1025.52, 1032.49, 1015.31
04/24/2018, 1019.98, 4750851, 1052, 1057, 1010.59
04/23/2018, 1067.45, 2278846, 1077.86, 1082.72, 1060.7
04/20/2018, 1072.96, 1887698, 1082, 1092.35, 1069.57
04/19/2018, 1087.7, 1741907, 1069.4, 1094.165, 1068.18
04/18/2018, 1072.08, 1336678, 1077.43, 1077.43, 1066.225
04/17/2018, 1074.16, 2311903, 1051.37, 1077.88, 1048.26
04/16/2018, 1037.98, 1194144, 1037, 1043.24, 1026.74
04/13/2018, 1029.27, 1175754, 1040.88, 1046.42, 1022.98
04/12/2018, 1032.51, 1357599, 1025.04, 1040.69, 1021.4347
04/11/2018, 1019.97, 1476133, 1027.99, 1031.3641, 1015.87
04/10/2018, 1031.64, 1983510, 1026.44, 1036.28, 1011.34
04/09/2018, 1015.45, 1738682, 1016.8, 1039.6, 1014.08
04/06/2018, 1007.04, 1740896, 1020, 1031.42, 1003.03
04/05/2018, 1027.81, 1345681, 1041.33, 1042.79, 1020.1311
04/04/2018, 1025.14, 2464418, 993.41, 1028.7175, 993
04/03/2018, 1013.41, 2271858, 1013.91, 1020.99, 994.07
04/02/2018, 1006.47, 2679214, 1022.82, 1034.8, 990.37
03/29/2018, 1031.79, 2714402, 1011.63, 1043, 1002.9
03/28/2018, 1004.56, 3345046, 998, 1024.23, 980.64
03/27/2018, 1005.1, 3081612, 1063, 1064.8393, 996.92
03/26/2018, 1053.21, 2593808, 1046, 1055.63, 1008.4
03/23/2018, 1021.57, 2147097, 1047.03, 1063.36, 1021.22
03/22/2018, 1049.08, 2584639, 1081.88, 1082.9, 1045.91
03/21/2018, 1090.88, 1878294, 1092.74, 1106.2999, 1085.15
03/20/2018, 1097.71, 1802209, 1099, 1105.2, 1083.46
03/19/2018, 1099.82, 2355186, 1120.01, 1121.99, 1089.01
03/16/2018, 1135.73, 2614871, 1154.14, 1155.88, 1131.96
03/15/2018, 1149.58, 1397767, 1149.96, 1161.08, 1134.54
03/14/2018, 1149.49, 1290638, 1145.21, 1158.59, 1141.44
03/13/2018, 1138.17, 1874176, 1170, 1176.76, 1133.33
03/12/2018, 1164.5, 2106548, 1163.85, 1177.05, 1157.42
03/09/2018, 1160.04, 2121425, 1136, 1160.8, 1132.4606
03/08/2018, 1126, 1393529, 1115.32, 1127.6, 1112.8
03/07/2018, 1109.64, 1277439, 1089.19, 1112.22, 1085.4823
03/06/2018, 1095.06, 1497087, 1099.22, 1101.85, 1089.775
03/05/2018, 1090.93, 1141932, 1075.14, 1097.1, 1069.0001
03/02/2018, 1078.92, 2271394, 1053.08, 1081.9986, 1048.115
03/01/2018, 1069.52, 2511872, 1107.87, 1110.12, 1067.001
02/28/2018, 1104.73, 1873737, 1123.03, 1127.53, 1103.24
02/27/2018, 1118.29, 1772866, 1141.24, 1144.04, 1118
02/26/2018, 1143.75, 1514920, 1127.8, 1143.96, 1126.695
02/23/2018, 1126.79, 1190432, 1112.64, 1127.28, 1104.7135
02/22/2018, 1106.63, 1309536, 1116.19, 1122.82, 1102.59
02/21/2018, 1111.34, 1507152, 1106.47, 1133.97, 1106.33
02/20/2018, 1102.46, 1389491, 1090.57, 1113.95, 1088.52
02/16/2018, 1094.8, 1680283, 1088.41, 1104.67, 1088.3134
02/15/2018, 1089.52, 1785552, 1079.07, 1091.4794, 1064.34
02/14/2018, 1069.7, 1547665, 1048.95, 1071.72, 1046.75
02/13/2018, 1052.1, 1213800, 1045, 1058.37, 1044.0872
02/12/2018, 1051.94, 2054002, 1048, 1061.5, 1040.928
02/09/2018, 1037.78, 3503970, 1017.25, 1043.97, 992.56
02/08/2018, 1001.52, 2809890, 1055.41, 1058.62, 1000.66
02/07/2018, 1048.58, 2353003, 1081.54, 1081.78, 1048.26
02/06/2018, 1080.6, 3432313, 1027.18, 1081.71, 1023.1367
02/05/2018, 1055.8, 3769453, 1090.6, 1110, 1052.03
02/02/2018, 1111.9, 4837979, 1122, 1123.07, 1107.2779
02/01/2018, 1167.7, 2380221, 1162.61, 1174, 1157.52
01/31/2018, 1169.94, 1523820, 1170.57, 1173, 1159.13
01/30/2018, 1163.69, 1541771, 1167.83, 1176.52, 1163.52
01/29/2018, 1175.58, 1337324, 1176.48, 1186.89, 1171.98
01/26/2018, 1175.84, 1981173, 1175.08, 1175.84, 1158.11
01/25/2018, 1170.37, 1461518, 1172.53, 1175.94, 1162.76
01/24/2018, 1164.24, 1382904, 1177.33, 1179.86, 1161.05
01/23/2018, 1169.97, 1309862, 1159.85, 1171.6266, 1158.75
01/22/2018, 1155.81, 1616120, 1137.49, 1159.88, 1135.1101
01/19/2018, 1137.51, 1390118, 1131.83, 1137.86, 1128.3
01/18/2018, 1129.79, 1194943, 1131.41, 1132.51, 1117.5
01/17/2018, 1131.98, 1200476, 1126.22, 1132.6, 1117.01
01/16/2018, 1121.76, 1566662, 1132.51, 1139.91, 1117.8316
01/12/2018, 1122.26, 1718491, 1102.41, 1124.29, 1101.15
01/11/2018, 1105.52, 977727, 1106.3, 1106.525, 1099.59
01/10/2018, 1102.61, 1042273, 1097.1, 1104.6, 1096.11
01/09/2018, 1106.26, 900089, 1109.4, 1110.57, 1101.2307
01/08/2018, 1106.94, 1046767, 1102.23, 1111.27, 1101.62
01/05/2018, 1102.23, 1279990, 1094, 1104.25, 1092
01/04/2018, 1086.4, 1002945, 1088, 1093.5699, 1084.0017
01/03/2018, 1082.48, 1429757, 1064.31, 1086.29, 1063.21
01/02/2018, 1065, 1236401, 1048.34, 1066.94, 1045.23
12/29/2017, 1046.4, 886845, 1046.72, 1049.7, 1044.9
12/28/2017, 1048.14, 833011, 1051.6, 1054.75, 1044.77
12/27/2017, 1049.37, 1271780, 1057.39, 1058.37, 1048.05
12/26/2017, 1056.74, 761097, 1058.07, 1060.12, 1050.2
12/22/2017, 1060.12, 755089, 1061.11, 1064.2, 1059.44
12/21/2017, 1063.63, 986548, 1064.95, 1069.33, 1061.7938
12/20/2017, 1064.95, 1268285, 1071.78, 1073.38, 1061.52
12/19/2017, 1070.68, 1307894, 1075.2, 1076.84, 1063.55
12/18/2017, 1077.14, 1552016, 1066.08, 1078.49, 1062
12/15/2017, 1064.19, 3275091, 1054.61, 1067.62, 1049.5
12/14/2017, 1049.15, 1558684, 1045, 1058.5, 1043.11
12/13/2017, 1040.61, 1220364, 1046.12, 1046.665, 1038.38
12/12/2017, 1040.48, 1279511, 1039.63, 1050.31, 1033.6897
12/11/2017, 1041.1, 1190527, 1035.5, 1043.8, 1032.0504
12/08/2017, 1037.05, 1288419, 1037.49, 1042.05, 1032.5222
12/07/2017, 1030.93, 1458145, 1020.43, 1034.24, 1018.071
12/06/2017, 1018.38, 1258496, 1001.5, 1024.97, 1001.14
12/05/2017, 1005.15, 2066247, 995.94, 1020.61, 988.28
12/04/2017, 998.68, 1906058, 1012.66, 1016.1, 995.57
12/01/2017, 1010.17, 1908962, 1015.8, 1022.4897, 1002.02
11/30/2017, 1021.41, 1723003, 1022.37, 1028.4899, 1015
11/29/2017, 1021.66, 2442974, 1042.68, 1044.08, 1015.65
11/28/2017, 1047.41, 1421027, 1055.09, 1062.375, 1040
11/27/2017, 1054.21, 1307471, 1040, 1055.46, 1038.44
11/24/2017, 1040.61, 536996, 1035.87, 1043.178, 1035
11/22/2017, 1035.96, 746351, 1035, 1039.706, 1031.43
11/21/2017, 1034.49, 1096161, 1023.31, 1035.11, 1022.655
11/20/2017, 1018.38, 898389, 1020.26, 1022.61, 1017.5
11/17/2017, 1019.09, 1366936, 1034.01, 1034.42, 1017.75
11/16/2017, 1032.5, 1129424, 1022.52, 1035.92, 1022.52
11/15/2017, 1020.91, 847932, 1019.21, 1024.09, 1015.42
11/14/2017, 1026, 958708, 1022.59, 1026.81, 1014.15
11/13/2017, 1025.75, 885565, 1023.42, 1031.58, 1022.57
11/10/2017, 1028.07, 720674, 1026.46, 1030.76, 1025.28
11/09/2017, 1031.26, 1244701, 1033.99, 1033.99, 1019.6656
11/08/2017, 1039.85, 1088395, 1030.52, 1043.522, 1028.45
11/07/2017, 1033.33, 1112123, 1027.27, 1033.97, 1025.13
11/06/2017, 1025.9, 1124757, 1028.99, 1034.87, 1025
11/03/2017, 1032.48, 1075134, 1022.11, 1032.65, 1020.31
11/02/2017, 1025.58, 1048584, 1021.76, 1028.09, 1013.01
11/01/2017, 1025.5, 1371619, 1017.21, 1029.67, 1016.95
10/31/2017, 1016.64, 1331265, 1015.22, 1024, 1010.42
10/30/2017, 1017.11, 2083490, 1014, 1024.97, 1007.5
10/27/2017, 1019.27, 5165922, 1009.19, 1048.39, 1008.2
10/26/2017, 972.56, 2027218, 980, 987.6, 972.2
10/25/2017, 973.33, 1210368, 968.37, 976.09, 960.5201
10/24/2017, 970.54, 1206074, 970, 972.23, 961
10/23/2017, 968.45, 1471544, 989.52, 989.52, 966.12
10/20/2017, 988.2, 1176177, 989.44, 991, 984.58
10/19/2017, 984.45, 1312706, 986, 988.88, 978.39
10/18/2017, 992.81, 1057285, 991.77, 996.72, 986.9747
10/17/2017, 992.18, 1290152, 990.29, 996.44, 988.59
10/16/2017, 992, 910246, 992.1, 993.9065, 984
10/13/2017, 989.68, 1169584, 992, 997.21, 989
10/12/2017, 987.83, 1278357, 987.45, 994.12, 985
10/11/2017, 989.25, 1692843, 973.72, 990.71, 972.25
10/10/2017, 972.6, 968113, 980, 981.57, 966.0801
10/09/2017, 977, 890620, 980, 985.425, 976.11
10/06/2017, 978.89, 1146207, 966.7, 979.46, 963.36
10/05/2017, 969.96, 1210427, 955.49, 970.91, 955.18
10/04/2017, 951.68, 951766, 957, 960.39, 950.69
10/03/2017, 957.79, 888303, 954, 958, 949.14
10/02/2017, 953.27, 1282850, 959.98, 962.54, 947.84
09/29/2017, 959.11, 1576365, 952, 959.7864, 951.51
09/28/2017, 949.5, 997036, 941.36, 950.69, 940.55
09/27/2017, 944.49, 2237538, 927.74, 949.9, 927.74
09/26/2017, 924.86, 1666749, 923.72, 930.82, 921.14
09/25/2017, 920.97, 1855742, 925.45, 926.4, 909.7
09/22/2017, 928.53, 1052170, 927.75, 934.73, 926.48
09/21/2017, 932.45, 1227059, 933, 936.53, 923.83
09/20/2017, 931.58, 1535626, 922.98, 933.88, 922
09/19/2017, 921.81, 912967, 917.42, 922.4199, 912.55
09/18/2017, 915, 1300759, 920.01, 922.08, 910.6
09/15/2017, 920.29, 2499466, 924.66, 926.49, 916.36
09/14/2017, 925.11, 1395497, 931.25, 932.77, 924
09/13/2017, 935.09, 1101145, 930.66, 937.25, 929.86
09/12/2017, 932.07, 1133638, 932.59, 933.48, 923.861
09/11/2017, 929.08, 1266020, 934.25, 938.38, 926.92
09/08/2017, 926.5, 997699, 936.49, 936.99, 924.88
09/07/2017, 935.95, 1211472, 931.73, 936.41, 923.62
09/06/2017, 927.81, 1526209, 930.15, 930.915, 919.27
09/05/2017, 928.45, 1346791, 933.08, 937, 921.96
09/01/2017, 937.34, 943657, 941.13, 942.48, 935.15
08/31/2017, 939.33, 1566888, 931.76, 941.98, 931.76
08/30/2017, 929.57, 1300616, 920.05, 930.819, 919.65
08/29/2017, 921.29, 1181391, 905.1, 923.33, 905
08/28/2017, 913.81, 1085014, 916, 919.245, 911.87
08/25/2017, 915.89, 1052764, 923.49, 925.555, 915.5
08/24/2017, 921.28, 1266191, 928.66, 930.84, 915.5
08/23/2017, 927, 1088575, 921.93, 929.93, 919.36
08/22/2017, 924.69, 1166320, 912.72, 925.86, 911.4751
08/21/2017, 906.66, 942328, 910, 913, 903.4
08/18/2017, 910.67, 1341990, 910.31, 915.275, 907.1543
08/17/2017, 910.98, 1241782, 925.78, 926.86, 910.98
08/16/2017, 926.96, 1005261, 925.29, 932.7, 923.445
08/15/2017, 922.22, 882479, 924.23, 926.5499, 919.82
08/14/2017, 922.67, 1063404, 922.53, 924.668, 918.19
08/11/2017, 914.39, 1205652, 907.97, 917.78, 905.58
08/10/2017, 907.24, 1755521, 917.55, 919.26, 906.13
08/09/2017, 922.9, 1191332, 920.61, 925.98, 917.2501
08/08/2017, 926.79, 1057351, 927.09, 935.814, 925.6095
08/07/2017, 929.36, 1031710, 929.06, 931.7, 926.5
08/04/2017, 927.96, 1081814, 926.75, 930.3068, 923.03
08/03/2017, 923.65, 1201519, 930.34, 932.24, 922.24
08/02/2017, 930.39, 1822272, 928.61, 932.6, 916.68
08/01/2017, 930.83, 1234612, 932.38, 937.447, 929.26
07/31/2017, 930.5, 1964748, 941.89, 943.59, 926.04
07/28/2017, 941.53, 1802343, 929.4, 943.83, 927.5
07/27/2017, 934.09, 3128819, 951.78, 951.78, 920
07/26/2017, 947.8, 2069349, 954.68, 955, 942.2788
07/25/2017, 950.7, 4656609, 953.81, 959.7, 945.4
07/24/2017, 980.34, 3205374, 972.22, 986.2, 970.77
07/21/2017, 972.92, 1697190, 962.25, 973.23, 960.15
07/20/2017, 968.15, 1620636, 975, 975.9, 961.51
07/19/2017, 970.89, 1221155, 967.84, 973.04, 964.03
07/18/2017, 965.4, 1152741, 953, 968.04, 950.6
07/17/2017, 953.42, 1164141, 957, 960.74, 949.2407
07/14/2017, 955.99, 1052855, 952, 956.91, 948.005
07/13/2017, 947.16, 1294674, 946.29, 954.45, 943.01
07/12/2017, 943.83, 1517168, 938.68, 946.3, 934.47
07/11/2017, 930.09, 1112417, 929.54, 931.43, 922
07/10/2017, 928.8, 1190237, 921.77, 930.38, 919.59
07/07/2017, 918.59, 1590456, 908.85, 921.54, 908.85
07/06/2017, 906.69, 1424290, 904.12, 914.9444, 899.7
07/05/2017, 911.71, 1813309, 901.76, 914.51, 898.5
07/03/2017, 898.7, 1710373, 912.18, 913.94, 894.79
06/30/2017, 908.73, 2086340, 926.05, 926.05, 908.31
06/29/2017, 917.79, 3287991, 929.92, 931.26, 910.62
06/28/2017, 940.49, 2719213, 929, 942.75, 916
06/27/2017, 927.33, 2566047, 942.46, 948.29, 926.85
06/26/2017, 952.27, 1596664, 969.9, 973.31, 950.79
06/23/2017, 965.59, 1527513, 956.83, 966, 954.2
06/22/2017, 957.09, 941639, 958.7, 960.72, 954.55
06/21/2017, 959.45, 1201971, 953.64, 960.1, 950.76
06/20/2017, 950.63, 1125520, 957.52, 961.62, 950.01
06/19/2017, 957.37, 1520715, 949.96, 959.99, 949.05
06/16/2017, 939.78, 3061794, 940, 942.04, 931.595
06/15/2017, 942.31, 2065271, 933.97, 943.339, 924.44
06/14/2017, 950.76, 1487378, 959.92, 961.15, 942.25
06/13/2017, 953.4, 2012980, 951.91, 959.98, 944.09
06/12/2017, 942.9, 3762434, 939.56, 949.355, 915.2328
06/09/2017, 949.83, 3305545, 984.5, 984.5, 935.63
06/08/2017, 983.41, 1477151, 982.35, 984.57, 977.2
06/07/2017, 981.08, 1447172, 979.65, 984.15, 975.77
06/06/2017, 976.57, 1814323, 983.16, 988.25, 975.14
06/05/2017, 983.68, 1251903, 976.55, 986.91, 975.1
06/02/2017, 975.6, 1750723, 969.46, 975.88, 966
06/01/2017, 966.95, 1408958, 968.95, 971.5, 960.01
05/31/2017, 964.86, 2447176, 975.02, 979.27, 960.18
05/30/2017, 975.88, 1466288, 970.31, 976.2, 969.49
05/26/2017, 971.47, 1251425, 969.7, 974.98, 965.03
05/25/2017, 969.54, 1659422, 957.33, 972.629, 955.47
05/24/2017, 954.96, 1031408, 952.98, 955.09, 949.5
05/23/2017, 948.82, 1269438, 947.92, 951.4666, 942.575
05/22/2017, 941.86, 1118456, 935, 941.8828, 935
05/19/2017, 934.01, 1389848, 931.47, 937.755, 931
05/18/2017, 930.24, 1596058, 921, 933.17, 918.75
05/17/2017, 919.62, 2357922, 935.67, 939.3325, 918.14
05/16/2017, 943, 968288, 940, 943.11, 937.58
05/15/2017, 937.08, 1104595, 932.95, 938.25, 929.34
05/12/2017, 932.22, 1050377, 931.53, 933.44, 927.85
05/11/2017, 930.6, 834997, 925.32, 932.53, 923.0301
05/10/2017, 928.78, 1173887, 931.98, 932, 925.16
05/09/2017, 932.17, 1581236, 936.95, 937.5, 929.53
05/08/2017, 934.3, 1328885, 926.12, 936.925, 925.26
05/05/2017, 927.13, 1910317, 933.54, 934.9, 925.2
05/04/2017, 931.66, 1421938, 926.07, 935.93, 924.59
05/03/2017, 927.04, 1497565, 914.86, 928.1, 912.5426
05/02/2017, 916.44, 1543696, 909.62, 920.77, 909.4526
05/01/2017, 912.57, 2114629, 901.94, 915.68, 901.45
04/28/2017, 905.96, 3223850, 910.66, 916.85, 905.77
04/27/2017, 874.25, 2009509, 873.6, 875.4, 870.38
04/26/2017, 871.73, 1233724, 874.23, 876.05, 867.7481
04/25/2017, 872.3, 1670095, 865, 875, 862.81
04/24/2017, 862.76, 1371722, 851.2, 863.45, 849.86
04/21/2017, 843.19, 1323364, 842.88, 843.88, 840.6
04/20/2017, 841.65, 957994, 841.44, 845.2, 839.32
04/19/2017, 838.21, 954324, 839.79, 842.22, 836.29
04/18/2017, 836.82, 835433, 834.22, 838.93, 832.71
04/17/2017, 837.17, 894540, 825.01, 837.75, 824.47
04/13/2017, 823.56, 1118221, 822.14, 826.38, 821.44
04/12/2017, 824.32, 900059, 821.93, 826.66, 821.02
04/11/2017, 823.35, 1078951, 824.71, 827.4267, 817.0201
04/10/2017, 824.73, 978825, 825.39, 829.35, 823.77
04/07/2017, 824.67, 1056692, 827.96, 828.485, 820.5127
04/06/2017, 827.88, 1254235, 832.4, 836.39, 826.46
04/05/2017, 831.41, 1553163, 835.51, 842.45, 830.72
04/04/2017, 834.57, 1044455, 831.36, 835.18, 829.0363
04/03/2017, 838.55, 1670349, 829.22, 840.85, 829.22
03/31/2017, 829.56, 1401756, 828.97, 831.64, 827.39
03/30/2017, 831.5, 1055263, 833.5, 833.68, 829
03/29/2017, 831.41, 1785006, 825, 832.765, 822.3801
03/28/2017, 820.92, 1620532, 820.41, 825.99, 814.027
03/27/2017, 819.51, 1894735, 806.95, 821.63, 803.37
03/24/2017, 814.43, 1980415, 820.08, 821.93, 808.89
03/23/2017, 817.58, 3485390, 821, 822.57, 812.257
03/22/2017, 829.59, 1399409, 831.91, 835.55, 827.1801
03/21/2017, 830.46, 2461375, 851.4, 853.5, 829.02
03/20/2017, 848.4, 1217560, 850.01, 850.22, 845.15
03/17/2017, 852.12, 1712397, 851.61, 853.4, 847.11
03/16/2017, 848.78, 977384, 849.03, 850.85, 846.13
03/15/2017, 847.2, 1381328, 847.59, 848.63, 840.77
03/14/2017, 845.62, 779920, 843.64, 847.24, 840.8
03/13/2017, 845.54, 1149928, 844, 848.685, 843.25
03/10/2017, 843.25, 1702731, 843.28, 844.91, 839.5
03/09/2017, 838.68, 1261393, 836, 842, 834.21
03/08/2017, 835.37, 988900, 833.51, 838.15, 831.79
03/07/2017, 831.91, 1037573, 827.4, 833.41, 826.52
03/06/2017, 827.78, 1108799, 826.95, 828.88, 822.4
03/03/2017, 829.08, 890640, 830.56, 831.36, 825.751
03/02/2017, 830.63, 937824, 833.85, 834.51, 829.64
03/01/2017, 835.24, 1495934, 828.85, 836.255, 827.26
02/28/2017, 823.21, 2258695, 825.61, 828.54, 820.2
02/27/2017, 829.28, 1101120, 824.55, 830.5, 824
02/24/2017, 828.64, 1392039, 827.73, 829, 824.2
02/23/2017, 831.33, 1471342, 830.12, 832.46, 822.88
02/22/2017, 830.76, 983058, 828.66, 833.25, 828.64
02/21/2017, 831.66, 1259841, 828.66, 833.45, 828.35
02/17/2017, 828.07, 1602549, 823.02, 828.07, 821.655
02/16/2017, 824.16, 1285919, 819.93, 824.4, 818.98
02/15/2017, 818.98, 1311316, 819.36, 823, 818.47
02/14/2017, 820.45, 1054472, 819, 823, 816
02/13/2017, 819.24, 1205835, 816, 820.959, 815.49
02/10/2017, 813.67, 1134701, 811.7, 815.25, 809.78
02/09/2017, 809.56, 990260, 809.51, 810.66, 804.54
02/08/2017, 808.38, 1155892, 807, 811.84, 803.1903
02/07/2017, 806.97, 1240257, 803.99, 810.5, 801.78
02/06/2017, 801.34, 1182882, 799.7, 801.67, 795.2501
02/03/2017, 801.49, 1461217, 802.99, 806, 800.37
02/02/2017, 798.53, 1530827, 793.8, 802.7, 792
02/01/2017, 795.695, 2027708, 799.68, 801.19, 791.19
01/31/2017, 796.79, 2153957, 796.86, 801.25, 790.52
01/30/2017, 802.32, 3243568, 814.66, 815.84, 799.8
01/27/2017, 823.31, 2964989, 834.71, 841.95, 820.44
01/26/2017, 832.15, 2944642, 837.81, 838, 827.01
01/25/2017, 835.67, 1612854, 829.62, 835.77, 825.06
01/24/2017, 823.87, 1472228, 822.3, 825.9, 817.821
01/23/2017, 819.31, 1962506, 807.25, 820.87, 803.74
01/20/2017, 805.02, 1668638, 806.91, 806.91, 801.69
01/19/2017, 802.175, 917085, 805.12, 809.48, 801.8
01/18/2017, 806.07, 1293893, 805.81, 806.205, 800.99
01/17/2017, 804.61, 1361935, 807.08, 807.14, 800.37
01/13/2017, 807.88, 1098154, 807.48, 811.2244, 806.69
01/12/2017, 806.36, 1352872, 807.14, 807.39, 799.17
01/11/2017, 807.91, 1065360, 805, 808.15, 801.37
01/10/2017, 804.79, 1176637, 807.86, 809.1299, 803.51
01/09/2017, 806.65, 1274318, 806.4, 809.9664, 802.83
01/06/2017, 806.15, 1639246, 795.26, 807.9, 792.2041
01/05/2017, 794.02, 1334028, 786.08, 794.48, 785.02
01/04/2017, 786.9, 1071198, 788.36, 791.34, 783.16
01/03/2017, 786.14, 1657291, 778.81, 789.63, 775.8
12/30/2016, 771.82, 1769809, 782.75, 782.78, 770.41
12/29/2016, 782.79, 743808, 783.33, 785.93, 778.92
12/28/2016, 785.05, 1142148, 793.7, 794.23, 783.2
12/27/2016, 791.55, 789151, 790.68, 797.86, 787.657
12/23/2016, 789.91, 623682, 790.9, 792.74, 787.28
12/22/2016, 791.26, 972147, 792.36, 793.32, 788.58
12/21/2016, 794.56, 1208770, 795.84, 796.6757, 787.1
12/20/2016, 796.42, 950345, 796.76, 798.65, 793.27
12/19/2016, 794.2, 1231966, 790.22, 797.66, 786.27
12/16/2016, 790.8, 2435100, 800.4, 800.8558, 790.29
12/15/2016, 797.85, 1623709, 797.34, 803, 792.92
12/14/2016, 797.07, 1700875, 797.4, 804, 794.01
12/13/2016, 796.1, 2122735, 793.9, 804.3799, 793.34
12/12/2016, 789.27, 2102288, 785.04, 791.25, 784.3554
12/09/2016, 789.29, 1821146, 780, 789.43, 779.021
12/08/2016, 776.42, 1487517, 772.48, 778.18, 767.23
12/07/2016, 771.19, 1757710, 761, 771.36, 755.8
12/06/2016, 759.11, 1690365, 764.73, 768.83, 757.34
12/05/2016, 762.52, 1393566, 757.71, 763.9, 752.9
12/02/2016, 750.5, 1452181, 744.59, 754, 743.1
12/01/2016, 747.92, 3017001, 757.44, 759.85, 737.0245
11/30/2016, 758.04, 2386628, 770.07, 772.99, 754.83
11/29/2016, 770.84, 1616427, 771.53, 778.5, 768.24
11/28/2016, 768.24, 2177039, 760, 779.53, 759.8
11/25/2016, 761.68, 587421, 764.26, 765, 760.52
11/23/2016, 760.99, 1477501, 767.73, 768.2825, 755.25
11/22/2016, 768.27, 1592372, 772.63, 776.96, 767
11/21/2016, 769.2, 1324431, 762.61, 769.7, 760.6
11/18/2016, 760.54, 1528555, 771.37, 775, 760
11/17/2016, 771.23, 1298484, 766.92, 772.7, 764.23
11/16/2016, 764.48, 1468196, 755.2, 766.36, 750.51
11/15/2016, 758.49, 2375056, 746.97, 764.4162, 746.97
11/14/2016, 736.08, 3644965, 755.6, 757.85, 727.54
11/11/2016, 754.02, 2421889, 756.54, 760.78, 750.38
11/10/2016, 762.56, 4733916, 791.17, 791.17, 752.18
11/09/2016, 785.31, 2603860, 779.94, 791.2265, 771.67
11/08/2016, 790.51, 1361472, 783.4, 795.633, 780.19
11/07/2016, 782.52, 1574426, 774.5, 785.19, 772.55
11/04/2016, 762.02, 2131948, 750.66, 770.36, 750.5611
11/03/2016, 762.13, 1933937, 767.25, 769.95, 759.03
11/02/2016, 768.7, 1905814, 778.2, 781.65, 763.4496
11/01/2016, 783.61, 2404898, 782.89, 789.49, 775.54
10/31/2016, 784.54, 2420892, 795.47, 796.86, 784
10/28/2016, 795.37, 4261912, 808.35, 815.49, 793.59
10/27/2016, 795.35, 2723097, 801, 803.49, 791.5
10/26/2016, 799.07, 1645403, 806.34, 806.98, 796.32
10/25/2016, 807.67, 1575020, 816.68, 816.68, 805.14
10/24/2016, 813.11, 1693162, 804.9, 815.18, 804.82
10/21/2016, 799.37, 1262042, 795, 799.5, 794
10/20/2016, 796.97, 1755546, 803.3, 803.97, 796.03
10/19/2016, 801.56, 1762990, 798.86, 804.63, 797.635
10/18/2016, 795.26, 2046338, 787.85, 801.61, 785.565
10/17/2016, 779.96, 1091524, 779.8, 785.85, 777.5
10/14/2016, 778.53, 851512, 781.65, 783.95, 776
10/13/2016, 778.19, 1360619, 781.22, 781.22, 773
10/12/2016, 786.14, 935138, 783.76, 788.13, 782.06
10/11/2016, 783.07, 1371461, 786.66, 792.28, 780.58
10/10/2016, 785.94, 1161410, 777.71, 789.38, 775.87
10/07/2016, 775.08, 932444, 779.66, 779.66, 770.75
10/06/2016, 776.86, 1066910, 779, 780.48, 775.54
10/05/2016, 776.47, 1457661, 779.31, 782.07, 775.65
10/04/2016, 776.43, 1198361, 776.03, 778.71, 772.89
10/03/2016, 772.56, 1276614, 774.25, 776.065, 769.5
09/30/2016, 777.29, 1583293, 776.33, 780.94, 774.09
09/29/2016, 775.01, 1310252, 781.44, 785.8, 774.232
09/28/2016, 781.56, 1108249, 777.85, 781.81, 774.97
09/27/2016, 783.01, 1152760, 775.5, 785.9899, 774.308
09/26/2016, 774.21, 1531788, 782.74, 782.74, 773.07
09/23/2016, 786.9, 1411439, 786.59, 788.93, 784.15
09/22/2016, 787.21, 1483899, 780, 789.85, 778.44
09/21/2016, 776.22, 1166290, 772.66, 777.16, 768.301
09/20/2016, 771.41, 975434, 769, 773.33, 768.53
09/19/2016, 765.7, 1171969, 772.42, 774, 764.4406
09/16/2016, 768.88, 2047036, 769.75, 769.75, 764.66
09/15/2016, 771.76, 1344945, 762.89, 773.8, 759.96
09/14/2016, 762.49, 1093723, 759.61, 767.68, 759.11
09/13/2016, 759.69, 1394158, 764.48, 766.2195, 755.8
09/12/2016, 769.02, 1310493, 755.13, 770.29, 754.0001
09/09/2016, 759.66, 1879903, 770.1, 773.245, 759.66
09/08/2016, 775.32, 1268663, 778.59, 780.35, 773.58
09/07/2016, 780.35, 893874, 780, 782.73, 776.2
09/06/2016, 780.08, 1441864, 773.45, 782, 771
09/02/2016, 771.46, 1070725, 773.01, 773.9199, 768.41
09/01/2016, 768.78, 925019, 769.25, 771.02, 764.3
08/31/2016, 767.05, 1247937, 767.01, 769.09, 765.38
08/30/2016, 769.09, 1129932, 769.33, 774.466, 766.84
08/29/2016, 772.15, 847537, 768.74, 774.99, 766.615
08/26/2016, 769.54, 1164713, 769, 776.0799, 765.85
08/25/2016, 769.41, 926856, 767, 771.89, 763.1846
08/24/2016, 769.64, 1071569, 770.58, 774.5, 767.07
08/23/2016, 772.08, 925356, 775.48, 776.44, 771.785
08/22/2016, 772.15, 950417, 773.27, 774.54, 770.0502
08/19/2016, 775.42, 860899, 775, 777.1, 773.13
08/18/2016, 777.5, 718882, 780.01, 782.86, 777
08/17/2016, 779.91, 921666, 777.32, 780.81, 773.53
08/16/2016, 777.14, 1027836, 780.3, 780.98, 773.444
08/15/2016, 782.44, 938183, 783.75, 787.49, 780.11
08/12/2016, 783.22, 739761, 781.5, 783.395, 780.4
08/11/2016, 784.85, 971742, 785, 789.75, 782.97
08/10/2016, 784.68, 784559, 783.75, 786.8123, 782.778
08/09/2016, 784.26, 1318457, 781.1, 788.94, 780.57
08/08/2016, 781.76, 1106693, 782, 782.63, 778.091
08/05/2016, 782.22, 1799478, 773.78, 783.04, 772.34
08/04/2016, 771.61, 1139972, 772.22, 774.07, 768.795
08/03/2016, 773.18, 1283186, 767.18, 773.21, 766.82
08/02/2016, 771.07, 1782822, 768.69, 775.84, 767.85
08/01/2016, 772.88, 2697699, 761.09, 780.43, 761.09
07/29/2016, 768.79, 3830103, 772.71, 778.55, 766.77
07/28/2016, 745.91, 3473040, 747.04, 748.65, 739.3
07/27/2016, 741.77, 1509133, 738.28, 744.46, 737
07/26/2016, 738.42, 1182993, 739.04, 741.69, 734.27
07/25/2016, 739.77, 1031643, 740.67, 742.61, 737.5
07/22/2016, 742.74, 1256741, 741.86, 743.24, 736.56
07/21/2016, 738.63, 1022229, 740.36, 741.69, 735.831
07/20/2016, 741.19, 1283931, 737.33, 742.13, 737.1
07/19/2016, 736.96, 1225467, 729.89, 736.99, 729
07/18/2016, 733.78, 1284740, 722.71, 736.13, 721.19
07/15/2016, 719.85, 1277514, 725.73, 725.74, 719.055
07/14/2016, 720.95, 949456, 721.58, 722.21, 718.03
07/13/2016, 716.98, 933352, 723.62, 724, 716.85
07/12/2016, 720.64, 1336112, 719.12, 722.94, 715.91
07/11/2016, 715.09, 1107039, 708.05, 716.51, 707.24
07/08/2016, 705.63, 1573909, 699.5, 705.71, 696.435
07/07/2016, 695.36, 1303661, 698.08, 698.2, 688.215
07/06/2016, 697.77, 1411080, 689.98, 701.68, 689.09
07/05/2016, 694.49, 1462879, 696.06, 696.94, 688.88
07/01/2016, 699.21, 1344387, 692.2, 700.65, 692.1301
06/30/2016, 692.1, 1597298, 685.47, 692.32, 683.65
06/29/2016, 684.11, 1931436, 683, 687.4292, 681.41
06/28/2016, 680.04, 2169704, 678.97, 680.33, 673
06/27/2016, 668.26, 2632011, 671, 672.3, 663.284
06/24/2016, 675.22, 4442943, 675.17, 689.4, 673.45
06/23/2016, 701.87, 2166183, 697.45, 701.95, 687
06/22/2016, 697.46, 1182161, 699.06, 700.86, 693.0819
06/21/2016, 695.94, 1464836, 698.4, 702.77, 692.01
06/20/2016, 693.71, 2080645, 698.77, 702.48, 693.41
06/17/2016, 691.72, 3397720, 708.65, 708.82, 688.4515
06/16/2016, 710.36, 1981657, 714.91, 716.65, 703.26
06/15/2016, 718.92, 1213386, 719, 722.98, 717.31
06/14/2016, 718.27, 1303808, 716.48, 722.47, 713.12
06/13/2016, 718.36, 1255199, 716.51, 725.44, 716.51
06/10/2016, 719.41, 1213989, 719.47, 725.89, 716.43
06/09/2016, 728.58, 987635, 722.87, 729.54, 722.3361
06/08/2016, 728.28, 1583325, 723.96, 728.57, 720.58
06/07/2016, 716.65, 1336348, 719.84, 721.98, 716.55
06/06/2016, 716.55, 1565955, 724.91, 724.91, 714.61
06/03/2016, 722.34, 1225924, 729.27, 729.49, 720.56
06/02/2016, 730.4, 1340664, 732.5, 733.02, 724.17
06/01/2016, 734.15, 1251468, 734.53, 737.21, 730.66
05/31/2016, 735.72, 2128358, 731.74, 739.73, 731.26
05/27/2016, 732.66, 1974425, 724.01, 733.936, 724
05/26/2016, 724.12, 1573635, 722.87, 728.33, 720.28
05/25/2016, 725.27, 1629790, 720.76, 727.51, 719.7047
05/24/2016, 720.09, 1926828, 706.86, 720.97, 706.86
05/23/2016, 704.24, 1326386, 706.53, 711.4781, 704.18
05/20/2016, 709.74, 1825830, 701.62, 714.58, 700.52
05/19/2016, 700.32, 1668887, 702.36, 706, 696.8
05/18/2016, 706.63, 1765632, 703.67, 711.6, 700.63
05/17/2016, 706.23, 1999883, 715.99, 721.52, 704.11
05/16/2016, 716.49, 1316719, 709.13, 718.48, 705.65
05/13/2016, 710.83, 1307559, 711.93, 716.6619, 709.26
05/12/2016, 713.31, 1361170, 717.06, 719.25, 709
05/11/2016, 715.29, 1690862, 723.41, 724.48, 712.8
05/10/2016, 723.18, 1568621, 716.75, 723.5, 715.72
05/09/2016, 712.9, 1509892, 712, 718.71, 710
05/06/2016, 711.12, 1828508, 698.38, 711.86, 698.1067
05/05/2016, 701.43, 1680220, 697.7, 702.3199, 695.72
05/04/2016, 695.7, 1692757, 690.49, 699.75, 689.01
05/03/2016, 692.36, 1541297, 696.87, 697.84, 692
05/02/2016, 698.21, 1645013, 697.63, 700.64, 691
04/29/2016, 693.01, 2486584, 690.7, 697.62, 689
04/28/2016, 691.02, 2859790, 708.26, 714.17, 689.55
04/27/2016, 705.84, 3094905, 707.29, 708.98, 692.3651
04/26/2016, 708.14, 2739133, 725.42, 725.766, 703.0264
04/25/2016, 723.15, 1956956, 716.1, 723.93, 715.59
04/22/2016, 718.77, 5949699, 726.3, 736.12, 713.61
04/21/2016, 759.14, 2995094, 755.38, 760.45, 749.55
04/20/2016, 752.67, 1526776, 758, 758.1315, 750.01
04/19/2016, 753.93, 2027962, 769.51, 769.9, 749.33
04/18/2016, 766.61, 1557199, 760.46, 768.05, 757.3
04/15/2016, 759, 1807062, 753.98, 761, 752.6938
04/14/2016, 753.2, 1134056, 754.01, 757.31, 752.705
04/13/2016, 751.72, 1707397, 749.16, 754.38, 744.261
04/12/2016, 743.09, 1349780, 738, 743.83, 731.01
04/11/2016, 736.1, 1218789, 743.02, 745, 736.05
04/08/2016, 739.15, 1289869, 743.97, 745.45, 735.55
04/07/2016, 740.28, 1452369, 745.37, 746.9999, 736.28
04/06/2016, 745.69, 1052171, 735.77, 746.24, 735.56
04/05/2016, 737.8, 1130817, 738, 742.8, 735.37
04/04/2016, 745.29, 1134214, 750.06, 752.8, 742.43
04/01/2016, 749.91, 1576240, 738.6, 750.34, 737
03/31/2016, 744.95, 1718638, 749.25, 750.85, 740.94
03/30/2016, 750.53, 1782278, 750.1, 757.88, 748.74
03/29/2016, 744.77, 1902254, 734.59, 747.25, 728.76
03/28/2016, 733.53, 1300817, 736.79, 738.99, 732.5
03/24/2016, 735.3, 1570474, 732.01, 737.747, 731
03/23/2016, 738.06, 1431130, 742.36, 745.7199, 736.15
03/22/2016, 740.75, 1269263, 737.46, 745, 737.46
03/21/2016, 742.09, 1835963, 736.5, 742.5, 733.5157
03/18/2016, 737.6, 2982194, 741.86, 742, 731.83
03/17/2016, 737.78, 1859562, 736.45, 743.07, 736
03/16/2016, 736.09, 1621412, 726.37, 737.47, 724.51
03/15/2016, 728.33, 1720790, 726.92, 732.29, 724.77
03/14/2016, 730.49, 1717002, 726.81, 735.5, 725.15
03/11/2016, 726.82, 1968164, 720, 726.92, 717.125
03/10/2016, 712.82, 2830630, 708.12, 716.44, 703.36
03/09/2016, 705.24, 1419661, 698.47, 705.68, 694
03/08/2016, 693.97, 2075305, 688.59, 703.79, 685.34
03/07/2016, 695.16, 2986064, 706.9, 708.0912, 686.9
03/04/2016, 710.89, 1971379, 714.99, 716.49, 706.02
03/03/2016, 712.42, 1956958, 718.68, 719.45, 706.02
03/02/2016, 718.85, 1629501, 719, 720, 712
03/01/2016, 718.81, 2148608, 703.62, 718.81, 699.77
02/29/2016, 697.77, 2478214, 700.32, 710.89, 697.68
02/26/2016, 705.07, 2241785, 708.58, 713.43, 700.86
02/25/2016, 705.75, 1640430, 700.01, 705.98, 690.585
02/24/2016, 699.56, 1961258, 688.92, 700, 680.78
02/23/2016, 695.85, 2006572, 701.45, 708.4, 693.58
02/22/2016, 706.46, 1949046, 707.45, 713.24, 702.51
02/19/2016, 700.91, 1585152, 695.03, 703.0805, 694.05
02/18/2016, 697.35, 1880306, 710, 712.35, 696.03
02/17/2016, 708.4, 2490021, 699, 709.75, 691.38
02/16/2016, 691, 2517324, 692.98, 698, 685.05
02/12/2016, 682.4, 2138937, 690.26, 693.75, 678.6
02/11/2016, 683.11, 3021587, 675, 689.35, 668.8675
02/10/2016, 684.12, 2629130, 686.86, 701.31, 682.13
02/09/2016, 678.11, 3605792, 672.32, 699.9, 668.77
02/08/2016, 682.74, 4241416, 667.85, 684.03, 663.06
02/05/2016, 683.57, 5098357, 703.87, 703.99, 680.15
02/04/2016, 708.01, 5157988, 722.81, 727, 701.86
02/03/2016, 726.95, 6166731, 770.22, 774.5, 720.5
02/02/2016, 764.65, 6340548, 784.5, 789.8699, 764.65
02/01/2016, 752, 5065235, 750.46, 757.86, 743.27
01/29/2016, 742.95, 3464432, 731.53, 744.9899, 726.8
01/28/2016, 730.96, 2664956, 722.22, 733.69, 712.35
01/27/2016, 699.99, 2175913, 713.67, 718.235, 694.39
01/26/2016, 713.04, 1329141, 713.85, 718.28, 706.48
01/25/2016, 711.67, 1709777, 723.58, 729.68, 710.01
01/22/2016, 725.25, 2009951, 723.6, 728.13, 720.121
01/21/2016, 706.59, 2411079, 702.18, 719.19, 694.46
01/20/2016, 698.45, 3441642, 688.61, 706.85, 673.26
01/19/2016, 701.79, 2264747, 703.3, 709.98, 693.4101
01/15/2016, 694.45, 3604137, 692.29, 706.74, 685.37
01/14/2016, 714.72, 2225495, 705.38, 721.925, 689.1
01/13/2016, 700.56, 2497086, 730.85, 734.74, 698.61
01/12/2016, 726.07, 2010026, 721.68, 728.75, 717.3165
01/11/2016, 716.03, 2089495, 716.61, 718.855, 703.54
01/08/2016, 714.47, 2449420, 731.45, 733.23, 713
01/07/2016, 726.39, 2960578, 730.31, 738.5, 719.06
01/06/2016, 743.62, 1943685, 730, 747.18, 728.92
01/05/2016, 742.58, 1949386, 746.45, 752, 738.64
01/04/2016, 741.84, 3271348, 743, 744.06, 731.2577
12/31/2015, 758.88, 1500129, 769.5, 769.5, 758.34
12/30/2015, 771, 1293514, 776.6, 777.6, 766.9
12/29/2015, 776.6, 1764044, 766.69, 779.98, 766.43
12/28/2015, 762.51, 1515574, 752.92, 762.99, 749.52
12/24/2015, 748.4, 527223, 749.55, 751.35, 746.62
12/23/2015, 750.31, 1566723, 753.47, 754.21, 744
12/22/2015, 750, 1365420, 751.65, 754.85, 745.53
12/21/2015, 747.77, 1524535, 746.13, 750, 740
12/18/2015, 739.31, 3140906, 746.51, 754.13, 738.15
12/17/2015, 749.43, 1551087, 762.42, 762.68, 749
12/16/2015, 758.09, 1986319, 750, 760.59, 739.435
12/15/2015, 743.4, 2661199, 753, 758.08, 743.01
12/14/2015, 747.77, 2417778, 741.79, 748.73, 724.17
12/11/2015, 738.87, 2223284, 741.16, 745.71, 736.75
12/10/2015, 749.46, 1988035, 752.85, 755.85, 743.83
12/09/2015, 751.61, 2697978, 759.17, 764.23, 737.001
12/08/2015, 762.37, 1829004, 757.89, 764.8, 754.2
12/07/2015, 763.25, 1811336, 767.77, 768.73, 755.09
12/04/2015, 766.81, 2756194, 753.1, 768.49, 750
12/03/2015, 752.54, 2589641, 766.01, 768.995, 745.63
12/02/2015, 762.38, 2196721, 768.9, 775.955, 758.96
12/01/2015, 767.04, 2131827, 747.11, 768.95, 746.7
11/30/2015, 742.6, 2045584, 748.81, 754.93, 741.27
11/27/2015, 750.26, 838528, 748.46, 753.41, 747.49
11/25/2015, 748.15, 1122224, 748.14, 752, 746.06
11/24/2015, 748.28, 2333700, 752, 755.279, 737.63
11/23/2015, 755.98, 1414640, 757.45, 762.7075, 751.82
11/20/2015, 756.6, 2212934, 746.53, 757.92, 743
11/19/2015, 738.41, 1327265, 738.74, 742, 737.43
11/18/2015, 740, 1683978, 727.58, 741.41, 727
11/17/2015, 725.3, 1507449, 729.29, 731.845, 723.027
11/16/2015, 728.96, 1904395, 715.6, 729.49, 711.33
11/13/2015, 717, 2072392, 729.17, 731.15, 716.73
11/12/2015, 731.23, 1836567, 731, 737.8, 728.645
11/11/2015, 735.4, 1366611, 732.46, 741, 730.23
11/10/2015, 728.32, 1606499, 724.4, 730.59, 718.5001
11/09/2015, 724.89, 2068920, 730.2, 734.71, 719.43
11/06/2015, 733.76, 1510586, 731.5, 735.41, 727.01
11/05/2015, 731.25, 1861100, 729.47, 739.48, 729.47
11/04/2015, 728.11, 1705745, 722, 733.1, 721.9
11/03/2015, 722.16, 1565355, 718.86, 724.65, 714.72
11/02/2015, 721.11, 1885155, 711.06, 721.62, 705.85
10/30/2015, 710.81, 1907732, 715.73, 718, 710.05
10/29/2015, 716.92, 1455508, 710.5, 718.26, 710.01
10/28/2015, 712.95, 2178841, 707.33, 712.98, 703.08
10/27/2015, 708.49, 2232183, 707.38, 713.62, 704.55
10/26/2015, 712.78, 2709292, 701.55, 719.15, 701.26
10/23/2015, 702, 6651909, 727.5, 730, 701.5
10/22/2015, 651.79, 3994360, 646.7, 657.8, 644.01
10/21/2015, 642.61, 1792869, 654.15, 655.87, 641.73
10/20/2015, 650.28, 2498077, 664.04, 664.7197, 644.195
10/19/2015, 666.1, 1465691, 661.18, 666.82, 659.58
10/16/2015, 662.2, 1610712, 664.11, 664.97, 657.2
10/15/2015, 661.74, 1832832, 654.66, 663.13, 654.46
10/14/2015, 651.16, 1413798, 653.21, 659.39, 648.85
10/13/2015, 652.3, 1806003, 643.15, 657.8125, 643.15
10/12/2015, 646.67, 1275565, 642.09, 648.5, 639.01
10/09/2015, 643.61, 1648656, 640, 645.99, 635.318
10/08/2015, 639.16, 2181990, 641.36, 644.45, 625.56
10/07/2015, 642.36, 2092536, 649.24, 650.609, 632.15
10/06/2015, 645.44, 2235078, 638.84, 649.25, 636.5295
10/05/2015, 641.47, 1802263, 632, 643.01, 627
10/02/2015, 626.91, 2681241, 607.2, 627.34, 603.13
10/01/2015, 611.29, 1866223, 608.37, 612.09, 599.85
09/30/2015, 608.42, 2412754, 603.28, 608.76, 600.73
09/29/2015, 594.97, 2310065, 597.28, 605, 590.22
09/28/2015, 594.89, 3118693, 610.34, 614.605, 589.38
09/25/2015, 611.97, 2173134, 629.77, 629.77, 611
09/24/2015, 625.8, 2238097, 616.64, 627.32, 612.4
09/23/2015, 622.36, 1470633, 622.05, 628.93, 620
09/22/2015, 622.69, 2561551, 627, 627.55, 615.43
09/21/2015, 635.44, 1786543, 634.4, 636.49, 625.94
09/18/2015, 629.25, 5123314, 636.79, 640, 627.02
09/17/2015, 642.9, 2259404, 637.79, 650.9, 635.02
09/16/2015, 635.98, 1276250, 635.47, 637.95, 632.32
09/15/2015, 635.14, 2082426, 626.7, 638.7, 623.78
09/14/2015, 623.24, 1701618, 625.7, 625.86, 619.43
09/11/2015, 625.77, 1372803, 619.75, 625.78, 617.42
09/10/2015, 621.35, 1903334, 613.1, 624.16, 611.43
09/09/2015, 612.72, 1699686, 621.22, 626.52, 609.6
09/08/2015, 614.66, 2277487, 612.49, 616.31, 604.12
09/04/2015, 600.7, 2087028, 600, 603.47, 595.25
09/03/2015, 606.25, 1757851, 617, 619.71, 602.8213
09/02/2015, 614.34, 2573982, 605.59, 614.34, 599.71
09/01/2015, 597.79, 3699844, 602.36, 612.86, 594.1
08/31/2015, 618.25, 2172168, 627.54, 635.8, 617.68
08/28/2015, 630.38, 1975818, 632.82, 636.88, 624.56
08/27/2015, 637.61, 3485906, 639.4, 643.59, 622
08/26/2015, 628.62, 4187276, 610.35, 631.71, 599.05
08/25/2015, 582.06, 3521916, 614.91, 617.45, 581.11
08/24/2015, 589.61, 5727282, 573, 614, 565.05
08/21/2015, 612.48, 4261666, 639.78, 640.05, 612.33
08/20/2015, 646.83, 2854028, 655.46, 662.99, 642.9
08/19/2015, 660.9, 2132265, 656.6, 667, 654.19
08/18/2015, 656.13, 1455664, 661.9, 664, 653.46
08/17/2015, 660.87, 1050553, 656.8, 661.38, 651.24
08/14/2015, 657.12, 1071333, 655.01, 659.855, 652.66
08/13/2015, 656.45, 1807182, 659.323, 664.5, 651.661
08/12/2015, 659.56, 2938651, 663.08, 665, 652.29
08/11/2015, 660.78, 5016425, 669.2, 674.9, 654.27
08/10/2015, 633.73, 1653836, 639.48, 643.44, 631.249
08/07/2015, 635.3, 1403441, 640.23, 642.68, 629.71
08/06/2015, 642.68, 1572150, 645, 645.379, 632.25
08/05/2015, 643.78, 2331720, 634.33, 647.86, 633.16
08/04/2015, 629.25, 1486858, 628.42, 634.81, 627.16
08/03/2015, 631.21, 1301439, 625.34, 633.0556, 625.34
07/31/2015, 625.61, 1705286, 631.38, 632.91, 625.5
07/30/2015, 632.59, 1472286, 630, 635.22, 622.05
07/29/2015, 631.93, 1573146, 628.8, 633.36, 622.65
07/28/2015, 628, 1713684, 632.83, 632.83, 623.31
07/27/2015, 627.26, 2673801, 621, 634.3, 620.5
07/24/2015, 623.56, 3622089, 647, 648.17, 622.52
07/23/2015, 644.28, 3014035, 661.27, 663.63, 641
07/22/2015, 662.1, 3707818, 660.89, 678.64, 659
07/21/2015, 662.3, 3363342, 655.21, 673, 654.3
07/20/2015, 663.02, 5857092, 659.24, 668.88, 653.01
07/17/2015, 672.93, 11153500, 649, 674.468, 645
07/16/2015, 579.85, 4559712, 565.12, 580.68, 565
07/15/2015, 560.22, 1782264, 560.13, 566.5029, 556.79
07/14/2015, 561.1, 3231284, 546.76, 565.8487, 546.71
07/13/2015, 546.55, 2204610, 532.88, 547.11, 532.4001
07/10/2015, 530.13, 1954951, 526.29, 532.56, 525.55
07/09/2015, 520.68, 1840155, 523.12, 523.77, 520.35
07/08/2015, 516.83, 1293372, 521.05, 522.734, 516.11
07/07/2015, 525.02, 1595672, 523.13, 526.18, 515.18
07/06/2015, 522.86, 1278587, 519.5, 525.25, 519
07/02/2015, 523.4, 1235773, 521.08, 524.65, 521.08
07/01/2015, 521.84, 1961197, 524.73, 525.69, 518.2305
06/30/2015, 520.51, 2234284, 526.02, 526.25, 520.5
06/29/2015, 521.52, 1935361, 525.01, 528.61, 520.54
06/26/2015, 531.69, 2108629, 537.26, 537.76, 531.35
06/25/2015, 535.23, 1332412, 538.87, 540.9, 535.23
06/24/2015, 537.84, 1286576, 540, 540, 535.66
06/23/2015, 540.48, 1196115, 539.64, 541.499, 535.25
06/22/2015, 538.19, 1243535, 539.59, 543.74, 537.53
06/19/2015, 536.69, 1890916, 537.21, 538.25, 533.01
06/18/2015, 536.73, 1832450, 531, 538.15, 530.79
06/17/2015, 529.26, 1269113, 529.37, 530.98, 525.1
06/16/2015, 528.15, 1071728, 528.4, 529.6399, 525.56
06/15/2015, 527.2, 1632675, 528, 528.3, 524
06/12/2015, 532.33, 955489, 531.6, 533.12, 530.16
06/11/2015, 534.61, 1208632, 538.425, 538.98, 533.02
06/10/2015, 536.69, 1813775, 529.36, 538.36, 529.35
06/09/2015, 526.69, 1454172, 527.56, 529.2, 523.01
06/08/2015, 526.83, 1523960, 533.31, 534.12, 526.24
06/05/2015, 533.33, 1375008, 536.35, 537.2, 532.52
06/04/2015, 536.7, 1346044, 537.76, 540.59, 534.32
06/03/2015, 540.31, 1716836, 539.91, 543.5, 537.11
06/02/2015, 539.18, 1936721, 532.93, 543, 531.33
06/01/2015, 533.99, 1900257, 536.79, 536.79, 529.76
05/29/2015, 532.11, 2590445, 537.37, 538.63, 531.45
05/28/2015, 539.78, 1029764, 538.01, 540.61, 536.25
05/27/2015, 539.79, 1524783, 532.8, 540.55, 531.71
05/26/2015, 532.32, 2404462, 538.12, 539, 529.88
05/22/2015, 540.11, 1175065, 540.15, 544.19, 539.51
05/21/2015, 542.51, 1461431, 537.95, 543.8399, 535.98
05/20/2015, 539.27, 1430565, 538.49, 542.92, 532.972
05/19/2015, 537.36, 1964037, 533.98, 540.66, 533.04
05/18/2015, 532.3, 2001117, 532.01, 534.82, 528.85
05/15/2015, 533.85, 1965088, 539.18, 539.2743, 530.38
05/14/2015, 538.4, 1401005, 533.77, 539, 532.41
05/13/2015, 529.62, 1253005, 530.56, 534.3215, 528.655
05/12/2015, 529.04, 1633180, 531.6, 533.2089, 525.26
05/11/2015, 535.7, 904465, 538.37, 541.98, 535.4
05/08/2015, 538.22, 1527181, 536.65, 541.15, 536
05/07/2015, 530.7, 1543986, 523.99, 533.46, 521.75
05/06/2015, 524.22, 1566865, 531.24, 532.38, 521.085
05/05/2015, 530.8, 1380519, 538.21, 539.74, 530.3906
05/04/2015, 540.78, 1303830, 538.53, 544.07, 535.06
05/01/2015, 537.9, 1758085, 538.43, 539.54, 532.1
04/30/2015, 537.34, 2080834, 547.87, 548.59, 535.05
04/29/2015, 549.08, 1696886, 550.47, 553.68, 546.905
04/28/2015, 553.68, 1490735, 554.64, 556.02, 550.366
04/27/2015, 555.37, 2390696, 563.39, 565.95, 553.2001
| -1 |
TheAlgorithms/Python | 6,246 | Get rid of the Union | ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| cclauss | "2022-07-11T10:58:09Z" | "2022-07-11T11:11:17Z" | ba129de7f32b6acd1efd8e942aca109bacd86646 | dad789d9034ea6fb183bddb1a34b6b89d379e422 | Get rid of the Union. ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| def get_reverse_bit_string(number: int) -> str:
"""
return the bit string of an integer
>>> get_reverse_bit_string(9)
'10010000000000000000000000000000'
>>> get_reverse_bit_string(43)
'11010100000000000000000000000000'
>>> get_reverse_bit_string(2873)
'10011100110100000000000000000000'
>>> get_reverse_bit_string("this is not a number")
Traceback (most recent call last):
...
TypeError: operation can not be conducted on a object of type str
"""
if not isinstance(number, int):
raise TypeError(
"operation can not be conducted on a object of type "
f"{type(number).__name__}"
)
bit_string = ""
for _ in range(0, 32):
bit_string += str(number % 2)
number = number >> 1
return bit_string
def reverse_bit(number: int) -> str:
"""
Take in a 32-bit integer, reverse its bits, and
return a string of the reversed bits, i.e. the
result of a reverse_bit operation on the integer provided.
>>> reverse_bit(25)
'00000000000000000000000000011001'
>>> reverse_bit(37)
'00000000000000000000000000100101'
>>> reverse_bit(21)
'00000000000000000000000000010101'
>>> reverse_bit(58)
'00000000000000000000000000111010'
>>> reverse_bit(0)
'00000000000000000000000000000000'
>>> reverse_bit(256)
'00000000000000000000000100000000'
>>> reverse_bit(-1)
Traceback (most recent call last):
...
ValueError: the value of input must be positive
>>> reverse_bit(1.1)
Traceback (most recent call last):
...
TypeError: Input value must be a 'int' type
>>> reverse_bit("0")
Traceback (most recent call last):
...
TypeError: '<' not supported between instances of 'str' and 'int'
"""
if number < 0:
raise ValueError("the value of input must be positive")
elif isinstance(number, float):
raise TypeError("Input value must be a 'int' type")
elif isinstance(number, str):
raise TypeError("'<' not supported between instances of 'str' and 'int'")
result = 0
# iterate over [1 to 32], since we are dealing with a 32-bit integer
for _ in range(1, 33):
# left shift the bits by unity
result = result << 1
# get the end bit
end_bit = number % 2
# right shift the bits by unity
number = number >> 1
# add that bit to our ans
result = result | end_bit
return get_reverse_bit_string(result)
if __name__ == "__main__":
import doctest
doctest.testmod()
| def get_reverse_bit_string(number: int) -> str:
"""
return the bit string of an integer
>>> get_reverse_bit_string(9)
'10010000000000000000000000000000'
>>> get_reverse_bit_string(43)
'11010100000000000000000000000000'
>>> get_reverse_bit_string(2873)
'10011100110100000000000000000000'
>>> get_reverse_bit_string("this is not a number")
Traceback (most recent call last):
...
TypeError: operation can not be conducted on a object of type str
"""
if not isinstance(number, int):
raise TypeError(
"operation can not be conducted on a object of type "
f"{type(number).__name__}"
)
bit_string = ""
for _ in range(0, 32):
bit_string += str(number % 2)
number = number >> 1
return bit_string
def reverse_bit(number: int) -> str:
"""
Take in a 32-bit integer, reverse its bits, and
return a string of the reversed bits, i.e. the
result of a reverse_bit operation on the integer provided.
>>> reverse_bit(25)
'00000000000000000000000000011001'
>>> reverse_bit(37)
'00000000000000000000000000100101'
>>> reverse_bit(21)
'00000000000000000000000000010101'
>>> reverse_bit(58)
'00000000000000000000000000111010'
>>> reverse_bit(0)
'00000000000000000000000000000000'
>>> reverse_bit(256)
'00000000000000000000000100000000'
>>> reverse_bit(-1)
Traceback (most recent call last):
...
ValueError: the value of input must be positive
>>> reverse_bit(1.1)
Traceback (most recent call last):
...
TypeError: Input value must be a 'int' type
>>> reverse_bit("0")
Traceback (most recent call last):
...
TypeError: '<' not supported between instances of 'str' and 'int'
"""
if number < 0:
raise ValueError("the value of input must be positive")
elif isinstance(number, float):
raise TypeError("Input value must be a 'int' type")
elif isinstance(number, str):
raise TypeError("'<' not supported between instances of 'str' and 'int'")
result = 0
# iterate over [1 to 32], since we are dealing with a 32-bit integer
for _ in range(1, 33):
# left shift the bits by unity
result = result << 1
# get the end bit
end_bit = number % 2
# right shift the bits by unity
number = number >> 1
# add that bit to our ans
result = result | end_bit
return get_reverse_bit_string(result)
if __name__ == "__main__":
import doctest
doctest.testmod()
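# A minimal cross-check of the two helpers above against the format() builtin,
# added as an illustrative sketch (the sample values are arbitrary):
for sample in (9, 25, 256, 2873):
    # get_reverse_bit_string() emits the 32 bits LSB-first, so reversing the
    # MSB-first string produced by format() reproduces it ...
    assert get_reverse_bit_string(sample) == f"{sample:032b}"[::-1]
    # ... and reverse_bit() flips the bit order first, so its LSB-first output
    # equals the ordinary MSB-first 32-bit binary representation.
    assert reverse_bit(sample) == f"{sample:032b}"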
| -1 |
TheAlgorithms/Python | 6,246 | Get rid of the Union | ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| cclauss | "2022-07-11T10:58:09Z" | "2022-07-11T11:11:17Z" | ba129de7f32b6acd1efd8e942aca109bacd86646 | dad789d9034ea6fb183bddb1a34b6b89d379e422 | Get rid of the Union. ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| """
Project Euler Problem 5: https://projecteuler.net/problem=5
Smallest multiple
2520 is the smallest number that can be divided by each of the numbers
from 1 to 10 without any remainder.
What is the smallest positive number that is _evenly divisible_ by all
of the numbers from 1 to 20?
References:
- https://en.wiktionary.org/wiki/evenly_divisible
"""
def solution(n: int = 20) -> int:
"""
Returns the smallest positive number that is evenly divisible (divisible
with no remainder) by all of the numbers from 1 to n.
>>> solution(10)
2520
>>> solution(15)
360360
>>> solution(22)
232792560
>>> solution(3.4)
6
>>> solution(0)
Traceback (most recent call last):
...
ValueError: Parameter n must be greater than or equal to one.
>>> solution(-17)
Traceback (most recent call last):
...
ValueError: Parameter n must be greater than or equal to one.
>>> solution([])
Traceback (most recent call last):
...
TypeError: Parameter n must be int or castable to int.
>>> solution("asd")
Traceback (most recent call last):
...
TypeError: Parameter n must be int or castable to int.
"""
try:
n = int(n)
except (TypeError, ValueError):
raise TypeError("Parameter n must be int or castable to int.")
if n <= 0:
raise ValueError("Parameter n must be greater than or equal to one.")
i = 0
while 1:
i += n * (n - 1)
nfound = 0
for j in range(2, n):
if i % j != 0:
nfound = 1
break
if nfound == 0:
if i == 0:
i = 1
return i
if __name__ == "__main__":
print(f"{solution() = }")
| """
Project Euler Problem 5: https://projecteuler.net/problem=5
Smallest multiple
2520 is the smallest number that can be divided by each of the numbers
from 1 to 10 without any remainder.
What is the smallest positive number that is _evenly divisible_ by all
of the numbers from 1 to 20?
References:
- https://en.wiktionary.org/wiki/evenly_divisible
"""
def solution(n: int = 20) -> int:
"""
Returns the smallest positive number that is evenly divisible (divisible
with no remainder) by all of the numbers from 1 to n.
>>> solution(10)
2520
>>> solution(15)
360360
>>> solution(22)
232792560
>>> solution(3.4)
6
>>> solution(0)
Traceback (most recent call last):
...
ValueError: Parameter n must be greater than or equal to one.
>>> solution(-17)
Traceback (most recent call last):
...
ValueError: Parameter n must be greater than or equal to one.
>>> solution([])
Traceback (most recent call last):
...
TypeError: Parameter n must be int or castable to int.
>>> solution("asd")
Traceback (most recent call last):
...
TypeError: Parameter n must be int or castable to int.
"""
try:
n = int(n)
except (TypeError, ValueError):
raise TypeError("Parameter n must be int or castable to int.")
if n <= 0:
raise ValueError("Parameter n must be greater than or equal to one.")
i = 0
while 1:
i += n * (n - 1)
nfound = 0
for j in range(2, n):
if i % j != 0:
nfound = 1
break
if nfound == 0:
if i == 0:
i = 1
return i
if __name__ == "__main__":
print(f"{solution() = }")
| -1 |
TheAlgorithms/Python | 6,246 | Get rid of the Union | ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| cclauss | "2022-07-11T10:58:09Z" | "2022-07-11T11:11:17Z" | ba129de7f32b6acd1efd8e942aca109bacd86646 | dad789d9034ea6fb183bddb1a34b6b89d379e422 | Get rid of the Union. ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| from __future__ import annotations
import csv
import requests
from bs4 import BeautifulSoup
def get_imdb_top_250_movies(url: str = "") -> dict[str, float]:
url = url or "https://www.imdb.com/chart/top/?ref_=nv_mv_250"
soup = BeautifulSoup(requests.get(url).text, "html.parser")
titles = soup.find_all("td", attrs="titleColumn")
ratings = soup.find_all("td", class_="ratingColumn imdbRating")
return {
title.a.text: float(rating.strong.text)
for title, rating in zip(titles, ratings)
}
def write_movies(filename: str = "IMDb_Top_250_Movies.csv") -> None:
movies = get_imdb_top_250_movies()
with open(filename, "w", newline="") as out_file:
writer = csv.writer(out_file)
writer.writerow(["Movie title", "IMDb rating"])
for title, rating in movies.items():
writer.writerow([title, rating])
if __name__ == "__main__":
write_movies()
| from __future__ import annotations
import csv
import requests
from bs4 import BeautifulSoup
def get_imdb_top_250_movies(url: str = "") -> dict[str, float]:
url = url or "https://www.imdb.com/chart/top/?ref_=nv_mv_250"
soup = BeautifulSoup(requests.get(url).text, "html.parser")
titles = soup.find_all("td", attrs="titleColumn")
ratings = soup.find_all("td", class_="ratingColumn imdbRating")
return {
title.a.text: float(rating.strong.text)
for title, rating in zip(titles, ratings)
}
def write_movies(filename: str = "IMDb_Top_250_Movies.csv") -> None:
movies = get_imdb_top_250_movies()
with open(filename, "w", newline="") as out_file:
writer = csv.writer(out_file)
writer.writerow(["Movie title", "IMDb rating"])
for title, rating in movies.items():
writer.writerow([title, rating])
if __name__ == "__main__":
write_movies()
| -1 |
TheAlgorithms/Python | 6,246 | Get rid of the Union | ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| cclauss | "2022-07-11T10:58:09Z" | "2022-07-11T11:11:17Z" | ba129de7f32b6acd1efd8e942aca109bacd86646 | dad789d9034ea6fb183bddb1a34b6b89d379e422 | Get rid of the Union. ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| """
The n-queens problem is that of placing N queens on an N * N
chessboard such that no queen can attack any other queen placed
on that chessboard.
This means that one queen cannot have any other queen on its horizontal, vertical and
diagonal lines.
"""
from __future__ import annotations
solution = []
def isSafe(board: list[list[int]], row: int, column: int) -> bool:
"""
This function returns a boolean value True if it is safe to place a queen there
considering the current state of the board.
Parameters :
board(2D matrix) : board
row ,column : coordinates of the cell on a board
Returns :
Boolean Value
"""
for i in range(len(board)):
if board[row][i] == 1:
return False
for i in range(len(board)):
if board[i][column] == 1:
return False
for i, j in zip(range(row, -1, -1), range(column, -1, -1)):
if board[i][j] == 1:
return False
for i, j in zip(range(row, -1, -1), range(column, len(board))):
if board[i][j] == 1:
return False
return True
def solve(board: list[list[int]], row: int) -> bool:
"""
It creates a state space tree and calls the safe function until it receives a
False Boolean and terminates that branch and backtracks to the next
possible solution branch.
"""
if row >= len(board):
"""
If the row number exceeds N we have board with a successful combination
and that combination is appended to the solution list and the board is printed.
"""
solution.append(board)
printboard(board)
print()
return True
for i in range(len(board)):
"""
For every row it iterates through each column to check if it is feasible to
place a queen there.
If all the combinations for that particular branch are successful the board is
reinitialized for the next possible combination.
"""
if isSafe(board, row, i):
board[row][i] = 1
solve(board, row + 1)
board[row][i] = 0
return False
def printboard(board: list[list[int]]) -> None:
"""
Prints the boards that have a successful combination.
"""
for i in range(len(board)):
for j in range(len(board)):
if board[i][j] == 1:
print("Q", end=" ")
else:
print(".", end=" ")
print()
# n=int(input("The no. of queens"))
n = 8
board = [[0 for i in range(n)] for j in range(n)]
solve(board, 0)
print("The total no. of solutions are :", len(solution))
| """
The n-queens problem is that of placing N queens on an N * N
chessboard such that no queen can attack any other queen placed
on that chessboard.
This means that one queen cannot have any other queen on its horizontal, vertical and
diagonal lines.
"""
from __future__ import annotations
solution = []
def isSafe(board: list[list[int]], row: int, column: int) -> bool:
"""
This function returns a boolean value True if it is safe to place a queen there
considering the current state of the board.
Parameters :
board(2D matrix) : board
row ,column : coordinates of the cell on a board
Returns :
Boolean Value
"""
for i in range(len(board)):
if board[row][i] == 1:
return False
for i in range(len(board)):
if board[i][column] == 1:
return False
for i, j in zip(range(row, -1, -1), range(column, -1, -1)):
if board[i][j] == 1:
return False
for i, j in zip(range(row, -1, -1), range(column, len(board))):
if board[i][j] == 1:
return False
return True
def solve(board: list[list[int]], row: int) -> bool:
"""
It creates a state space tree and calls the safe function until it receives a
False Boolean and terminates that branch and backtracks to the next
possible solution branch.
"""
if row >= len(board):
"""
If the row number exceeds N we have board with a successful combination
and that combination is appended to the solution list and the board is printed.
"""
solution.append(board)
printboard(board)
print()
return True
for i in range(len(board)):
"""
For every row it iterates through each column to check if it is feasible to
place a queen there.
If all the combinations for that particular branch are successful the board is
reinitialized for the next possible combination.
"""
if isSafe(board, row, i):
board[row][i] = 1
solve(board, row + 1)
board[row][i] = 0
return False
def printboard(board: list[list[int]]) -> None:
"""
Prints the boards that have a successful combination.
"""
for i in range(len(board)):
for j in range(len(board)):
if board[i][j] == 1:
print("Q", end=" ")
else:
print(".", end=" ")
print()
# n=int(input("The no. of queens"))
n = 8
board = [[0 for i in range(n)] for j in range(n)]
solve(board, 0)
print("The total no. of solutions are :", len(solution))
| -1 |
TheAlgorithms/Python | 6,246 | Get rid of the Union | ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| cclauss | "2022-07-11T10:58:09Z" | "2022-07-11T11:11:17Z" | ba129de7f32b6acd1efd8e942aca109bacd86646 | dad789d9034ea6fb183bddb1a34b6b89d379e422 | Get rid of the Union. ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| -1 |
||
TheAlgorithms/Python | 6,246 | Get rid of the Union | ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| cclauss | "2022-07-11T10:58:09Z" | "2022-07-11T11:11:17Z" | ba129de7f32b6acd1efd8e942aca109bacd86646 | dad789d9034ea6fb183bddb1a34b6b89d379e422 | Get rid of the Union. ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| """
Project Euler Problem 8: https://projecteuler.net/problem=8
Largest product in a series
The four adjacent digits in the 1000-digit number that have the greatest
product are 9 × 9 × 8 × 9 = 5832.
73167176531330624919225119674426574742355349194934
96983520312774506326239578318016984801869478851843
85861560789112949495459501737958331952853208805511
12540698747158523863050715693290963295227443043557
66896648950445244523161731856403098711121722383113
62229893423380308135336276614282806444486645238749
30358907296290491560440772390713810515859307960866
70172427121883998797908792274921901699720888093776
65727333001053367881220235421809751254540594752243
52584907711670556013604839586446706324415722155397
53697817977846174064955149290862569321978468622482
83972241375657056057490261407972968652414535100474
82166370484403199890008895243450658541227588666881
16427171479924442928230863465674813919123162824586
17866458359124566529476545682848912883142607690042
24219022671055626321111109370544217506941658960408
07198403850962455444362981230987879927244284909188
84580156166097919133875499200524063689912560717606
05886116467109405077541002256983155200055935729725
71636269561882670428252483600823257530420752963450
Find the thirteen adjacent digits in the 1000-digit number that have the
greatest product. What is the value of this product?
"""
from functools import reduce
N = (
"73167176531330624919225119674426574742355349194934"
"96983520312774506326239578318016984801869478851843"
"85861560789112949495459501737958331952853208805511"
"12540698747158523863050715693290963295227443043557"
"66896648950445244523161731856403098711121722383113"
"62229893423380308135336276614282806444486645238749"
"30358907296290491560440772390713810515859307960866"
"70172427121883998797908792274921901699720888093776"
"65727333001053367881220235421809751254540594752243"
"52584907711670556013604839586446706324415722155397"
"53697817977846174064955149290862569321978468622482"
"83972241375657056057490261407972968652414535100474"
"82166370484403199890008895243450658541227588666881"
"16427171479924442928230863465674813919123162824586"
"17866458359124566529476545682848912883142607690042"
"24219022671055626321111109370544217506941658960408"
"07198403850962455444362981230987879927244284909188"
"84580156166097919133875499200524063689912560717606"
"05886116467109405077541002256983155200055935729725"
"71636269561882670428252483600823257530420752963450"
)
def solution(n: str = N) -> int:
"""
Find the thirteen adjacent digits in the 1000-digit number n that have
the greatest product and returns it.
>>> solution("13978431290823798458352374")
609638400
>>> solution("13978431295823798458352374")
2612736000
>>> solution("1397843129582379841238352374")
209018880
"""
return max(
# mypy cannot properly interpret reduce
int(reduce(lambda x, y: str(int(x) * int(y)), n[i : i + 13]))
for i in range(len(n) - 12)
)
if __name__ == "__main__":
print(f"{solution() = }")
| """
Project Euler Problem 8: https://projecteuler.net/problem=8
Largest product in a series
The four adjacent digits in the 1000-digit number that have the greatest
product are 9 × 9 × 8 × 9 = 5832.
73167176531330624919225119674426574742355349194934
96983520312774506326239578318016984801869478851843
85861560789112949495459501737958331952853208805511
12540698747158523863050715693290963295227443043557
66896648950445244523161731856403098711121722383113
62229893423380308135336276614282806444486645238749
30358907296290491560440772390713810515859307960866
70172427121883998797908792274921901699720888093776
65727333001053367881220235421809751254540594752243
52584907711670556013604839586446706324415722155397
53697817977846174064955149290862569321978468622482
83972241375657056057490261407972968652414535100474
82166370484403199890008895243450658541227588666881
16427171479924442928230863465674813919123162824586
17866458359124566529476545682848912883142607690042
24219022671055626321111109370544217506941658960408
07198403850962455444362981230987879927244284909188
84580156166097919133875499200524063689912560717606
05886116467109405077541002256983155200055935729725
71636269561882670428252483600823257530420752963450
Find the thirteen adjacent digits in the 1000-digit number that have the
greatest product. What is the value of this product?
"""
from functools import reduce
N = (
"73167176531330624919225119674426574742355349194934"
"96983520312774506326239578318016984801869478851843"
"85861560789112949495459501737958331952853208805511"
"12540698747158523863050715693290963295227443043557"
"66896648950445244523161731856403098711121722383113"
"62229893423380308135336276614282806444486645238749"
"30358907296290491560440772390713810515859307960866"
"70172427121883998797908792274921901699720888093776"
"65727333001053367881220235421809751254540594752243"
"52584907711670556013604839586446706324415722155397"
"53697817977846174064955149290862569321978468622482"
"83972241375657056057490261407972968652414535100474"
"82166370484403199890008895243450658541227588666881"
"16427171479924442928230863465674813919123162824586"
"17866458359124566529476545682848912883142607690042"
"24219022671055626321111109370544217506941658960408"
"07198403850962455444362981230987879927244284909188"
"84580156166097919133875499200524063689912560717606"
"05886116467109405077541002256983155200055935729725"
"71636269561882670428252483600823257530420752963450"
)
def solution(n: str = N) -> int:
"""
Find the thirteen adjacent digits in the 1000-digit number n that have
the greatest product and returns it.
>>> solution("13978431290823798458352374")
609638400
>>> solution("13978431295823798458352374")
2612736000
>>> solution("1397843129582379841238352374")
209018880
"""
return max(
# mypy cannot properly interpret reduce
int(reduce(lambda x, y: str(int(x) * int(y)), n[i : i + 13]))
for i in range(len(n) - 12)
)
if __name__ == "__main__":
print(f"{solution() = }")
| -1 |
TheAlgorithms/Python | 6,246 | Get rid of the Union | ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| cclauss | "2022-07-11T10:58:09Z" | "2022-07-11T11:11:17Z" | ba129de7f32b6acd1efd8e942aca109bacd86646 | dad789d9034ea6fb183bddb1a34b6b89d379e422 | Get rid of the Union. ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| """
Project Euler Problem 65: https://projecteuler.net/problem=65
The square root of 2 can be written as an infinite continued fraction.
sqrt(2) = 1 + 1 / (2 + 1 / (2 + 1 / (2 + 1 / (2 + ...))))
The infinite continued fraction can be written, sqrt(2) = [1;(2)], (2)
indicates that 2 repeats ad infinitum. In a similar way, sqrt(23) =
[4;(1,3,1,8)].
It turns out that the sequence of partial values of continued
fractions for square roots provide the best rational approximations.
Let us consider the convergents for sqrt(2).
1 + 1 / 2 = 3/2
1 + 1 / (2 + 1 / 2) = 7/5
1 + 1 / (2 + 1 / (2 + 1 / 2)) = 17/12
1 + 1 / (2 + 1 / (2 + 1 / (2 + 1 / 2))) = 41/29
Hence the sequence of the first ten convergents for sqrt(2) are:
1, 3/2, 7/5, 17/12, 41/29, 99/70, 239/169, 577/408, 1393/985, 3363/2378, ...
What is most surprising is that the important mathematical constant,
e = [2;1,2,1,1,4,1,1,6,1,...,1,2k,1,...].
The first ten terms in the sequence of convergents for e are:
2, 3, 8/3, 11/4, 19/7, 87/32, 106/39, 193/71, 1264/465, 1457/536, ...
The sum of digits in the numerator of the 10th convergent is
1 + 4 + 5 + 7 = 17.
Find the sum of the digits in the numerator of the 100th convergent
of the continued fraction for e.
-----
The solution mostly comes down to finding an equation that will generate
the numerator of the continued fraction. For the i-th numerator, the
pattern is:
n_i = m_i * n_(i-1) + n_(i-2)
for m_i = the i-th index of the continued fraction representation of e,
n_0 = 1, and n_1 = 2 as the first 2 numbers of the representation.
For example:
n_9 = 6 * 193 + 106 = 1264
1 + 2 + 6 + 4 = 13
n_10 = 1 * 193 + 1264 = 1457
1 + 4 + 5 + 7 = 17
"""
def sum_digits(num: int) -> int:
"""
Returns the sum of every digit in num.
>>> sum_digits(1)
1
>>> sum_digits(12345)
15
>>> sum_digits(999001)
28
"""
digit_sum = 0
while num > 0:
digit_sum += num % 10
num //= 10
return digit_sum
def solution(max: int = 100) -> int:
"""
Returns the sum of the digits in the numerator of the max-th convergent of
the continued fraction for e.
>>> solution(9)
13
>>> solution(10)
17
>>> solution(50)
91
"""
pre_numerator = 1
cur_numerator = 2
for i in range(2, max + 1):
temp = pre_numerator
e_cont = 2 * i // 3 if i % 3 == 0 else 1
pre_numerator = cur_numerator
cur_numerator = e_cont * pre_numerator + temp
return sum_digits(cur_numerator)
if __name__ == "__main__":
print(f"{solution() = }")
| """
Project Euler Problem 65: https://projecteuler.net/problem=65
The square root of 2 can be written as an infinite continued fraction.
sqrt(2) = 1 + 1 / (2 + 1 / (2 + 1 / (2 + 1 / (2 + ...))))
The infinite continued fraction can be written, sqrt(2) = [1;(2)], (2)
indicates that 2 repeats ad infinitum. In a similar way, sqrt(23) =
[4;(1,3,1,8)].
It turns out that the sequence of partial values of continued
fractions for square roots provide the best rational approximations.
Let us consider the convergents for sqrt(2).
1 + 1 / 2 = 3/2
1 + 1 / (2 + 1 / 2) = 7/5
1 + 1 / (2 + 1 / (2 + 1 / 2)) = 17/12
1 + 1 / (2 + 1 / (2 + 1 / (2 + 1 / 2))) = 41/29
Hence the sequence of the first ten convergents for sqrt(2) are:
1, 3/2, 7/5, 17/12, 41/29, 99/70, 239/169, 577/408, 1393/985, 3363/2378, ...
What is most surprising is that the important mathematical constant,
e = [2;1,2,1,1,4,1,1,6,1,...,1,2k,1,...].
The first ten terms in the sequence of convergents for e are:
2, 3, 8/3, 11/4, 19/7, 87/32, 106/39, 193/71, 1264/465, 1457/536, ...
The sum of digits in the numerator of the 10th convergent is
1 + 4 + 5 + 7 = 17.
Find the sum of the digits in the numerator of the 100th convergent
of the continued fraction for e.
-----
The solution mostly comes down to finding an equation that will generate
the numerator of the continued fraction. For the i-th numerator, the
pattern is:
n_i = m_i * n_(i-1) + n_(i-2)
for m_i = the i-th index of the continued fraction representation of e,
n_0 = 1, and n_1 = 2 as the first 2 numbers of the representation.
For example:
n_9 = 6 * 193 + 106 = 1264
1 + 2 + 6 + 4 = 13
n_10 = 1 * 193 + 1264 = 1457
1 + 4 + 5 + 7 = 17
"""
def sum_digits(num: int) -> int:
"""
Returns the sum of every digit in num.
>>> sum_digits(1)
1
>>> sum_digits(12345)
15
>>> sum_digits(999001)
28
"""
digit_sum = 0
while num > 0:
digit_sum += num % 10
num //= 10
return digit_sum
def solution(max: int = 100) -> int:
"""
Returns the sum of the digits in the numerator of the max-th convergent of
the continued fraction for e.
>>> solution(9)
13
>>> solution(10)
17
>>> solution(50)
91
"""
pre_numerator = 1
cur_numerator = 2
for i in range(2, max + 1):
temp = pre_numerator
e_cont = 2 * i // 3 if i % 3 == 0 else 1
pre_numerator = cur_numerator
cur_numerator = e_cont * pre_numerator + temp
return sum_digits(cur_numerator)
if __name__ == "__main__":
print(f"{solution() = }")
| -1 |
TheAlgorithms/Python | 6,246 | Get rid of the Union | ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| cclauss | "2022-07-11T10:58:09Z" | "2022-07-11T11:11:17Z" | ba129de7f32b6acd1efd8e942aca109bacd86646 | dad789d9034ea6fb183bddb1a34b6b89d379e422 | Get rid of the Union. ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| """
== Raise base to the power of exponent using recursion ==
Input -->
Enter the base: 3
Enter the exponent: 4
Output -->
3 to the power of 4 is 81
Input -->
Enter the base: 2
Enter the exponent: 0
Output -->
2 to the power of 0 is 1
"""
def power(base: int, exponent: int) -> float:
"""
>>> power(3, 4)
81
>>> power(2, 0)
1
>>> all(power(base, exponent) == pow(base, exponent)
... for base in range(-10, 10) for exponent in range(10))
True
"""
return base * power(base, (exponent - 1)) if exponent else 1
if __name__ == "__main__":
print("Raise base to the power of exponent using recursion...")
base = int(input("Enter the base: ").strip())
exponent = int(input("Enter the exponent: ").strip())
result = power(base, abs(exponent))
if exponent < 0: # power() does not properly deal w/ negative exponents
result = 1 / result
print(f"{base} to the power of {exponent} is {result}")
| """
== Raise base to the power of exponent using recursion ==
Input -->
Enter the base: 3
Enter the exponent: 4
Output -->
3 to the power of 4 is 81
Input -->
Enter the base: 2
Enter the exponent: 0
Output -->
2 to the power of 0 is 1
"""
def power(base: int, exponent: int) -> float:
"""
>>> power(3, 4)
81
>>> power(2, 0)
1
>>> all(power(base, exponent) == pow(base, exponent)
... for base in range(-10, 10) for exponent in range(10))
True
"""
return base * power(base, (exponent - 1)) if exponent else 1
if __name__ == "__main__":
print("Raise base to the power of exponent using recursion...")
base = int(input("Enter the base: ").strip())
exponent = int(input("Enter the exponent: ").strip())
result = power(base, abs(exponent))
if exponent < 0: # power() does not properly deal w/ negative exponents
result = 1 / result
print(f"{base} to the power of {exponent} is {result}")
| -1 |
TheAlgorithms/Python | 6,246 | Get rid of the Union | ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| cclauss | "2022-07-11T10:58:09Z" | "2022-07-11T11:11:17Z" | ba129de7f32b6acd1efd8e942aca109bacd86646 | dad789d9034ea6fb183bddb1a34b6b89d379e422 | Get rid of the Union. ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| """
Implementation of gaussian filter algorithm
"""
from itertools import product
from cv2 import COLOR_BGR2GRAY, cvtColor, imread, imshow, waitKey
from numpy import dot, exp, mgrid, pi, ravel, square, uint8, zeros
def gen_gaussian_kernel(k_size, sigma):
center = k_size // 2
x, y = mgrid[0 - center : k_size - center, 0 - center : k_size - center]
g = 1 / (2 * pi * sigma) * exp(-(square(x) + square(y)) / (2 * square(sigma)))
return g
def gaussian_filter(image, k_size, sigma):
height, width = image.shape[0], image.shape[1]
# dst image height and width
dst_height = height - k_size + 1
dst_width = width - k_size + 1
# im2col, turn the k_size*k_size pixels into a row and np.vstack all rows
image_array = zeros((dst_height * dst_width, k_size * k_size))
row = 0
for i, j in product(range(dst_height), range(dst_width)):
window = ravel(image[i : i + k_size, j : j + k_size])
image_array[row, :] = window
row += 1
# turn the kernel into shape(k*k, 1)
gaussian_kernel = gen_gaussian_kernel(k_size, sigma)
filter_array = ravel(gaussian_kernel)
# reshape and get the dst image
dst = dot(image_array, filter_array).reshape(dst_height, dst_width).astype(uint8)
return dst
if __name__ == "__main__":
# read original image
img = imread(r"../image_data/lena.jpg")
# convert the image to gray scale
gray = cvtColor(img, COLOR_BGR2GRAY)
# get values with two different mask size
gaussian3x3 = gaussian_filter(gray, 3, sigma=1)
gaussian5x5 = gaussian_filter(gray, 5, sigma=0.8)
# show result images
imshow("gaussian filter with 3x3 mask", gaussian3x3)
imshow("gaussian filter with 5x5 mask", gaussian5x5)
waitKey()
| """
Implementation of gaussian filter algorithm
"""
from itertools import product
from cv2 import COLOR_BGR2GRAY, cvtColor, imread, imshow, waitKey
from numpy import dot, exp, mgrid, pi, ravel, square, uint8, zeros
def gen_gaussian_kernel(k_size, sigma):
center = k_size // 2
x, y = mgrid[0 - center : k_size - center, 0 - center : k_size - center]
g = 1 / (2 * pi * sigma) * exp(-(square(x) + square(y)) / (2 * square(sigma)))
return g
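# Note that the textbook 2D Gaussian uses sigma squared in the normalization,
# G(x, y) = exp(-(x**2 + y**2) / (2 * sigma**2)) / (2 * pi * sigma**2),
# and blur kernels are usually rescaled to sum to 1 so overall brightness is
# preserved. A sketch of such a normalized kernel built on the helper above
# (gen_normalized_gaussian_kernel is an illustrative name, not part of the file):
def gen_normalized_gaussian_kernel(k_size, sigma):
    kernel = gen_gaussian_kernel(k_size, sigma)
    return kernel / kernel.sum()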
def gaussian_filter(image, k_size, sigma):
height, width = image.shape[0], image.shape[1]
# dst image height and width
dst_height = height - k_size + 1
dst_width = width - k_size + 1
# im2col, turn the k_size*k_size pixels into a row and np.vstack all rows
image_array = zeros((dst_height * dst_width, k_size * k_size))
row = 0
for i, j in product(range(dst_height), range(dst_width)):
window = ravel(image[i : i + k_size, j : j + k_size])
image_array[row, :] = window
row += 1
# turn the kernel into shape(k*k, 1)
gaussian_kernel = gen_gaussian_kernel(k_size, sigma)
filter_array = ravel(gaussian_kernel)
# reshape and get the dst image
dst = dot(image_array, filter_array).reshape(dst_height, dst_width).astype(uint8)
return dst
if __name__ == "__main__":
# read original image
img = imread(r"../image_data/lena.jpg")
# convert the image to gray scale
gray = cvtColor(img, COLOR_BGR2GRAY)
# get values with two different mask size
gaussian3x3 = gaussian_filter(gray, 3, sigma=1)
gaussian5x5 = gaussian_filter(gray, 5, sigma=0.8)
# show result images
imshow("gaussian filter with 3x3 mask", gaussian3x3)
imshow("gaussian filter with 5x5 mask", gaussian5x5)
waitKey()
| -1 |
TheAlgorithms/Python | 6,246 | Get rid of the Union | ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| cclauss | "2022-07-11T10:58:09Z" | "2022-07-11T11:11:17Z" | ba129de7f32b6acd1efd8e942aca109bacd86646 | dad789d9034ea6fb183bddb1a34b6b89d379e422 | Get rid of the Union. ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| """
Demonstrates implementation of SHA1 Hash function in a Python class and gives utilities
to find hash of string or hash of text from a file.
Usage: python sha1.py --string "Hello World!!"
python sha1.py --file "hello_world.txt"
When run without any arguments, it prints the hash of the string "Hello World!!
Welcome to Cryptography"
Also contains a Test class to verify that the generated Hash is same as that
returned by the hashlib library
SHA1 hash or SHA1 sum of a string is a cryptographic hash function, which means it is easy
to calculate forwards but extremely difficult to calculate backwards. What this means
is, you can easily calculate the hash of a string, but it is extremely difficult to
know the original string if you have its hash. This property is useful to communicate
securely, send encrypted messages and is very useful in payment systems, blockchain
and cryptocurrency etc.
The Algorithm as described in the reference:
First we start with a message. The message is padded and the length of the message
is added to the end. It is then split into blocks of 512 bits or 64 bytes. The blocks
are then processed one at a time. Each block must be expanded and compressed.
The value after each compression is added to a 160-bit buffer called the current hash
state. After the last block is processed the current hash state is returned as
the final hash.
Reference: https://deadhacker.com/2006/02/21/sha-1-illustrated/
"""
import argparse
import hashlib # hashlib is only used inside the Test class
import struct
import unittest
class SHA1Hash:
"""
Class to contain the entire pipeline for SHA1 Hashing Algorithm
>>> SHA1Hash(bytes('Allan', 'utf-8')).final_hash()
'872af2d8ac3d8695387e7c804bf0e02c18df9e6e'
"""
def __init__(self, data):
"""
Initializes the variables data and h. h is a list of 5 8-digit Hexadecimal
numbers corresponding to
(1732584193, 4023233417, 2562383102, 271733878, 3285377520)
respectively. We will start with this as a message digest. 0x is how you write
Hexadecimal numbers in Python
"""
self.data = data
self.h = [0x67452301, 0xEFCDAB89, 0x98BADCFE, 0x10325476, 0xC3D2E1F0]
@staticmethod
def rotate(n, b):
"""
Static method to be used inside other methods. Left rotates n by b.
>>> SHA1Hash('').rotate(12,2)
48
"""
return ((n << b) | (n >> (32 - b))) & 0xFFFFFFFF
def padding(self):
"""
Pads the input message with zeros so that padded_data has 64 bytes or 512 bits
"""
padding = b"\x80" + b"\x00" * (63 - (len(self.data) + 8) % 64)
padded_data = self.data + padding + struct.pack(">Q", 8 * len(self.data))
return padded_data
def split_blocks(self):
"""
Returns a list of bytestrings each of length 64
"""
return [
self.padded_data[i : i + 64] for i in range(0, len(self.padded_data), 64)
]
# @staticmethod
def expand_block(self, block):
"""
Takes a bytestring-block of length 64, unpacks it to a list of integers and
returns a list of 80 integers after some bit operations
"""
w = list(struct.unpack(">16L", block)) + [0] * 64
for i in range(16, 80):
w[i] = self.rotate((w[i - 3] ^ w[i - 8] ^ w[i - 14] ^ w[i - 16]), 1)
return w
def final_hash(self):
"""
Calls all the other methods to process the input. Pads the data, then splits
into blocks and then does a series of operations for each block (including
expansion).
For each block, the variable h that was initialized is copied to a,b,c,d,e
and these 5 variables a,b,c,d,e undergo several changes. After all the blocks
are processed, these 5 variables are pairwise added to h ie a to h[0], b to h[1]
and so on. This h becomes our final hash which is returned.
"""
self.padded_data = self.padding()
self.blocks = self.split_blocks()
for block in self.blocks:
expanded_block = self.expand_block(block)
a, b, c, d, e = self.h
for i in range(0, 80):
if 0 <= i < 20:
f = (b & c) | ((~b) & d)
k = 0x5A827999
elif 20 <= i < 40:
f = b ^ c ^ d
k = 0x6ED9EBA1
elif 40 <= i < 60:
f = (b & c) | (b & d) | (c & d)
k = 0x8F1BBCDC
elif 60 <= i < 80:
f = b ^ c ^ d
k = 0xCA62C1D6
a, b, c, d, e = (
self.rotate(a, 5) + f + e + k + expanded_block[i] & 0xFFFFFFFF,
a,
self.rotate(b, 30),
c,
d,
)
self.h = (
self.h[0] + a & 0xFFFFFFFF,
self.h[1] + b & 0xFFFFFFFF,
self.h[2] + c & 0xFFFFFFFF,
self.h[3] + d & 0xFFFFFFFF,
self.h[4] + e & 0xFFFFFFFF,
)
return "%08x%08x%08x%08x%08x" % tuple(self.h)
class SHA1HashTest(unittest.TestCase):
"""
Test class for the SHA1Hash class. Inherits the TestCase class from unittest
"""
def testMatchHashes(self):
msg = bytes("Test String", "utf-8")
self.assertEqual(SHA1Hash(msg).final_hash(), hashlib.sha1(msg).hexdigest())
def main():
"""
Provides option 'string' or 'file' to take input and prints the calculated SHA1
hash. unittest.main() has been commented because we probably don't want to run
the test each time.
"""
# unittest.main()
parser = argparse.ArgumentParser(description="Process some strings or files")
parser.add_argument(
"--string",
dest="input_string",
default="Hello World!! Welcome to Cryptography",
help="Hash the string",
)
parser.add_argument("--file", dest="input_file", help="Hash contents of a file")
args = parser.parse_args()
input_string = args.input_string
# In any case hash input should be a bytestring
if args.input_file:
with open(args.input_file, "rb") as f:
hash_input = f.read()
else:
hash_input = bytes(input_string, "utf-8")
print(SHA1Hash(hash_input).final_hash())
if __name__ == "__main__":
main()
import doctest
doctest.testmod()
| """
Demonstrates implementation of SHA1 Hash function in a Python class and gives utilities
to find hash of string or hash of text from a file.
Usage: python sha1.py --string "Hello World!!"
python sha1.py --file "hello_world.txt"
When run without any arguments, it prints the hash of the string "Hello World!!
Welcome to Cryptography"
Also contains a Test class to verify that the generated hash is the same as that
returned by the hashlib library.
The SHA1 hash or SHA1 sum of a string is produced by a cryptographic hash function,
which means it is easy to calculate forwards but extremely difficult to calculate
backwards. What this means
is, you can easily calculate the hash of a string, but it is extremely difficult to
know the original string if you have its hash. This property is useful to communicate
securely, send encrypted messages and is very useful in payment systems, blockchain
and cryptocurrency etc.
The Algorithm as described in the reference:
First we start with a message. The message is padded and the length of the message
is added to the end. It is then split into blocks of 512 bits or 64 bytes. The blocks
are then processed one at a time. Each block must be expanded and compressed.
The value after each compression is added to a 160-bit buffer called the current hash
state. After the last block is processed the current hash state is returned as
the final hash.
Reference: https://deadhacker.com/2006/02/21/sha-1-illustrated/
"""
import argparse
import hashlib # hashlib is only used inside the Test class
import struct
import unittest
class SHA1Hash:
"""
Class to contain the entire pipeline for SHA1 Hashing Algorithm
>>> SHA1Hash(bytes('Allan', 'utf-8')).final_hash()
'872af2d8ac3d8695387e7c804bf0e02c18df9e6e'
"""
def __init__(self, data):
"""
Initializes the variables data and h. h is a list of 5 8-digit hexadecimal
numbers corresponding to
(1732584193, 4023233417, 2562383102, 271733878, 3285377520)
respectively. We will start with this as a message digest. 0x is how you write
Hexadecimal numbers in Python
"""
self.data = data
self.h = [0x67452301, 0xEFCDAB89, 0x98BADCFE, 0x10325476, 0xC3D2E1F0]
@staticmethod
def rotate(n, b):
"""
Static method to be used inside other methods. Left rotates n by b.
>>> SHA1Hash('').rotate(12,2)
48
"""
return ((n << b) | (n >> (32 - b))) & 0xFFFFFFFF
def padding(self):
"""
Pads the input message with a 0x80 byte, zero bytes and the 8-byte message length
so that the length of padded_data is a multiple of 64 bytes (512 bits)
"""
padding = b"\x80" + b"\x00" * (63 - (len(self.data) + 8) % 64)
padded_data = self.data + padding + struct.pack(">Q", 8 * len(self.data))
return padded_data
def split_blocks(self):
"""
Returns a list of bytestrings each of length 64
"""
return [
self.padded_data[i : i + 64] for i in range(0, len(self.padded_data), 64)
]
# @staticmethod
def expand_block(self, block):
"""
Takes a bytestring-block of length 64, unpacks it to a list of integers and
returns a list of 80 integers after some bit operations
"""
w = list(struct.unpack(">16L", block)) + [0] * 64
for i in range(16, 80):
w[i] = self.rotate((w[i - 3] ^ w[i - 8] ^ w[i - 14] ^ w[i - 16]), 1)
return w
def final_hash(self):
"""
Calls all the other methods to process the input. Pads the data, then splits
into blocks and then does a series of operations for each block (including
expansion).
For each block, the variable h that was initialized is copied to a,b,c,d,e
and these 5 variables a,b,c,d,e undergo several changes. After all the blocks
are processed, these 5 variables are pairwise added to h, i.e. a to h[0], b to h[1]
and so on. This h becomes our final hash which is returned.
"""
self.padded_data = self.padding()
self.blocks = self.split_blocks()
for block in self.blocks:
expanded_block = self.expand_block(block)
a, b, c, d, e = self.h
for i in range(0, 80):
if 0 <= i < 20:
f = (b & c) | ((~b) & d)
k = 0x5A827999
elif 20 <= i < 40:
f = b ^ c ^ d
k = 0x6ED9EBA1
elif 40 <= i < 60:
f = (b & c) | (b & d) | (c & d)
k = 0x8F1BBCDC
elif 60 <= i < 80:
f = b ^ c ^ d
k = 0xCA62C1D6
a, b, c, d, e = (
self.rotate(a, 5) + f + e + k + expanded_block[i] & 0xFFFFFFFF,
a,
self.rotate(b, 30),
c,
d,
)
self.h = (
self.h[0] + a & 0xFFFFFFFF,
self.h[1] + b & 0xFFFFFFFF,
self.h[2] + c & 0xFFFFFFFF,
self.h[3] + d & 0xFFFFFFFF,
self.h[4] + e & 0xFFFFFFFF,
)
return "%08x%08x%08x%08x%08x" % tuple(self.h)
class SHA1HashTest(unittest.TestCase):
"""
Test class for the SHA1Hash class. Inherits the TestCase class from unittest
"""
def testMatchHashes(self):
msg = bytes("Test String", "utf-8")
self.assertEqual(SHA1Hash(msg).final_hash(), hashlib.sha1(msg).hexdigest())
def main():
"""
Provides option 'string' or 'file' to take input and prints the calculated SHA1
hash. unittest.main() has been commented because we probably don't want to run
the test each time.
"""
# unittest.main()
parser = argparse.ArgumentParser(description="Process some strings or files")
parser.add_argument(
"--string",
dest="input_string",
default="Hello World!! Welcome to Cryptography",
help="Hash the string",
)
parser.add_argument("--file", dest="input_file", help="Hash contents of a file")
args = parser.parse_args()
input_string = args.input_string
# In any case hash input should be a bytestring
if args.input_file:
with open(args.input_file, "rb") as f:
hash_input = f.read()
else:
hash_input = bytes(input_string, "utf-8")
print(SHA1Hash(hash_input).final_hash())
if __name__ == "__main__":
main()
import doctest
doctest.testmod()
| -1 |
TheAlgorithms/Python | 6,246 | Get rid of the Union | ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| cclauss | "2022-07-11T10:58:09Z" | "2022-07-11T11:11:17Z" | ba129de7f32b6acd1efd8e942aca109bacd86646 | dad789d9034ea6fb183bddb1a34b6b89d379e422 | Get rid of the Union. ### Describe your change:
* [ ] Add an algorithm?
* [x] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [x] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| """
This algorithm was created for sdbm (a public-domain reimplementation of ndbm)
database library.
It was found to do well in scrambling bits, causing better distribution of the keys
and fewer splits.
It also happens to be a good general hashing function with good distribution.
The actual function (pseudo code) is:
for i in i..len(str):
hash(i) = hash(i - 1) * 65599 + str[i];
What is included below is the faster version used in gawk. [there is even a faster,
duff-device version]
The magic constant 65599 was picked out of thin air while experimenting with
different constants.
It turns out to be a prime.
This is one of the algorithms used in berkeley db (see sleepycat) and elsewhere.
source: http://www.cse.yorku.ca/~oz/hash.html
"""
def sdbm(plain_text: str) -> int:
"""
Function implements the sdbm hash: easy to use, great for scrambling bits.
It iterates over each character in the given string and applies the hash update to
each of them.
>>> sdbm('Algorithms')
1462174910723540325254304520539387479031000036
>>> sdbm('scramble bits')
730247649148944819640658295400555317318720608290373040936089
"""
hash = 0
for plain_chr in plain_text:
hash = ord(plain_chr) + (hash << 6) + (hash << 16) - hash
return hash
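# A minimal sketch (added for illustration, not part of the original module) of why
# the shift-based update above matches the docstring's "hash(i - 1) * 65599 + str[i]"
# recurrence: (h << 6) + (h << 16) - h == h * (64 + 65536 - 1) == h * 65599.
_demo_h = 12345  # hypothetical value, used only for this check
assert (_demo_h << 6) + (_demo_h << 16) - _demo_h == _demo_h * 65599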
| """
This algorithm was created for sdbm (a public-domain reimplementation of ndbm)
database library.
It was found to do well in scrambling bits, causing better distribution of the keys
and fewer splits.
It also happens to be a good general hashing function with good distribution.
The actual function (pseudo code) is:
for i in i..len(str):
hash(i) = hash(i - 1) * 65599 + str[i];
What is included below is the faster version used in gawk. [there is even a faster,
duff-device version]
The magic constant 65599 was picked out of thin air while experimenting with
different constants.
It turns out to be a prime.
This is one of the algorithms used in berkeley db (see sleepycat) and elsewhere.
source: http://www.cse.yorku.ca/~oz/hash.html
"""
def sdbm(plain_text: str) -> int:
"""
Function implements the sdbm hash: easy to use, great for scrambling bits.
It iterates over each character in the given string and applies the hash update to
each of them.
>>> sdbm('Algorithms')
1462174910723540325254304520539387479031000036
>>> sdbm('scramble bits')
730247649148944819640658295400555317318720608290373040936089
"""
hash = 0
for plain_chr in plain_text:
hash = ord(plain_chr) + (hash << 6) + (hash << 16) - hash
return hash
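# Cross-check sketch (added for illustration, not part of the original module):
# computing the docstring's recurrence directly with multiplication reproduces the
# doctest value above for 'Algorithms'.
_expected = 0
for _ch in "Algorithms":
    _expected = _expected * 65599 + ord(_ch)
assert _expected == 1462174910723540325254304520539387479031000036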
| -1 |
TheAlgorithms/Python | 6,236 | Upgrade GitHub Actions | ### Describe your change:
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
* [x] Upgrade automated testing
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [ ] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in their comments that points to Wikipedia or other similar explanations.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| cclauss | "2022-07-06T20:10:25Z" | "2022-07-07T03:25:25Z" | 9135a1f41192ebe1d835282a1465dc284359d95c | 0a0f4986e4fde05ebc2a24c9cc2cd6b8200b8df1 | Upgrade GitHub Actions. ### Describe your change:
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
* [x] Upgrade automated testing
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [ ] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in their comments that points to Wikipedia or other similar explanations.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| name: "build"
on:
pull_request:
schedule:
- cron: "0 0 * * *" # Run everyday
jobs:
build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: actions/setup-python@v2
with:
python-version: "3.10"
- uses: actions/cache@v2
with:
path: ~/.cache/pip
key: ${{ runner.os }}-pip-${{ hashFiles('requirements.txt') }}
- name: Install dependencies
run: |
python -m pip install --upgrade pip setuptools six wheel
python -m pip install pytest-cov -r requirements.txt
- name: Run tests
run: pytest --doctest-modules --ignore=project_euler/ --ignore=scripts/validate_solutions.py --cov-report=term-missing:skip-covered --cov=. .
- if: ${{ success() }}
run: scripts/build_directory_md.py 2>&1 | tee DIRECTORY.md
| name: "build"
on:
pull_request:
schedule:
- cron: "0 0 * * *" # Run everyday
jobs:
build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/setup-python@v4
with:
python-version: 3.x
- uses: actions/cache@v3
with:
path: ~/.cache/pip
key: ${{ runner.os }}-pip-${{ hashFiles('requirements.txt') }}
- name: Install dependencies
run: |
python -m pip install --upgrade pip setuptools six wheel
python -m pip install pytest-cov -r requirements.txt
- name: Run tests
run: pytest --doctest-modules --ignore=project_euler/ --ignore=scripts/validate_solutions.py --cov-report=term-missing:skip-covered --cov=. .
- if: ${{ success() }}
run: scripts/build_directory_md.py 2>&1 | tee DIRECTORY.md
| 1 |
TheAlgorithms/Python | 6,236 | Upgrade GitHub Actions | ### Describe your change:
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
* [x] Upgrade automated testing
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [ ] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in their comments that points to Wikipedia or other similar explanations.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| cclauss | "2022-07-06T20:10:25Z" | "2022-07-07T03:25:25Z" | 9135a1f41192ebe1d835282a1465dc284359d95c | 0a0f4986e4fde05ebc2a24c9cc2cd6b8200b8df1 | Upgrade GitHub Actions. ### Describe your change:
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
* [x] Upgrade automated testing
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [ ] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in their comments that points to Wikipedia or other similar explanations.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| # The objective of this GitHub Action is to update the DIRECTORY.md file (if needed)
# when doing a git push
name: directory_writer
on: [push]
jobs:
build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v1 # v1, NOT v2
- uses: actions/setup-python@v2
- name: Write DIRECTORY.md
run: |
scripts/build_directory_md.py 2>&1 | tee DIRECTORY.md
git config --global user.name github-actions
git config --global user.email '${GITHUB_ACTOR}@users.noreply.github.com'
git remote set-url origin https://x-access-token:${{ secrets.GITHUB_TOKEN }}@github.com/$GITHUB_REPOSITORY
- name: Update DIRECTORY.md
run: |
git add DIRECTORY.md
git commit -am "updating DIRECTORY.md" || true
git push --force origin HEAD:$GITHUB_REF || true
| # The objective of this GitHub Action is to update the DIRECTORY.md file (if needed)
# when doing a git push
name: directory_writer
on: [push]
jobs:
build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v1 # v1, NOT v2 or v3
- uses: actions/setup-python@v4
with:
python-version: 3.x
- name: Write DIRECTORY.md
run: |
scripts/build_directory_md.py 2>&1 | tee DIRECTORY.md
git config --global user.name github-actions
git config --global user.email '${GITHUB_ACTOR}@users.noreply.github.com'
git remote set-url origin https://x-access-token:${{ secrets.GITHUB_TOKEN }}@github.com/$GITHUB_REPOSITORY
- name: Update DIRECTORY.md
run: |
git add DIRECTORY.md
git commit -am "updating DIRECTORY.md" || true
git push --force origin HEAD:$GITHUB_REF || true
| 1 |
TheAlgorithms/Python | 6,236 | Upgrade GitHub Actions | ### Describe your change:
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
* [x] Upgrade automated testing
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [ ] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in their comments that points to Wikipedia or other similar explanations.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| cclauss | "2022-07-06T20:10:25Z" | "2022-07-07T03:25:25Z" | 9135a1f41192ebe1d835282a1465dc284359d95c | 0a0f4986e4fde05ebc2a24c9cc2cd6b8200b8df1 | Upgrade GitHub Actions. ### Describe your change:
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
* [x] Upgrade automated testing
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [ ] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in their comments that points to Wikipedia or other similar explanations.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| name: pre-commit
on: [push, pull_request]
jobs:
pre-commit:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: actions/cache@v2
with:
path: |
~/.cache/pre-commit
~/.cache/pip
key: ${{ runner.os }}-pre-commit-${{ hashFiles('.pre-commit-config.yaml') }}
- uses: actions/setup-python@v2
with:
python-version: "3.10"
- uses: psf/[email protected]
- name: Install pre-commit
run: |
python -m pip install --upgrade pip
python -m pip install --upgrade pre-commit
- run: pre-commit run --verbose --all-files --show-diff-on-failure
| name: pre-commit
on: [push, pull_request]
jobs:
pre-commit:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/cache@v3
with:
path: |
~/.cache/pre-commit
~/.cache/pip
key: ${{ runner.os }}-pre-commit-${{ hashFiles('.pre-commit-config.yaml') }}
- uses: actions/setup-python@v4
with:
python-version: 3.x
# - uses: psf/[email protected]
- name: Install pre-commit
run: |
python -m pip install --upgrade pip
python -m pip install --upgrade pre-commit
- run: pre-commit run --verbose --all-files --show-diff-on-failure
| 1 |
TheAlgorithms/Python | 6,236 | Upgrade GitHub Actions | ### Describe your change:
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
* [x] Upgrade automated testing
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [ ] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in their comments that points to Wikipedia or other similar explanations.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| cclauss | "2022-07-06T20:10:25Z" | "2022-07-07T03:25:25Z" | 9135a1f41192ebe1d835282a1465dc284359d95c | 0a0f4986e4fde05ebc2a24c9cc2cd6b8200b8df1 | Upgrade GitHub Actions. ### Describe your change:
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
* [x] Upgrade automated testing
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [ ] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in their comments that points to Wikipedia or other similar explanations.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| on:
pull_request:
# Run only if a file is changed within the project_euler directory and related files
paths:
- "project_euler/**"
- ".github/workflows/project_euler.yml"
- "scripts/validate_solutions.py"
schedule:
- cron: "0 0 * * *" # Run everyday
name: "Project Euler"
jobs:
project-euler:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: actions/setup-python@v2
- name: Install pytest and pytest-cov
run: |
python -m pip install --upgrade pip
python -m pip install --upgrade pytest pytest-cov
- run: pytest --doctest-modules --cov-report=term-missing:skip-covered --cov=project_euler/ project_euler/
validate-solutions:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: actions/setup-python@v2
- name: Install pytest and requests
run: |
python -m pip install --upgrade pip
python -m pip install --upgrade pytest requests
- run: pytest scripts/validate_solutions.py
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
| on:
pull_request:
# Run only if a file is changed within the project_euler directory and related files
paths:
- "project_euler/**"
- ".github/workflows/project_euler.yml"
- "scripts/validate_solutions.py"
schedule:
- cron: "0 0 * * *" # Run everyday
name: "Project Euler"
jobs:
project-euler:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/setup-python@v4
with:
python-version: 3.x
- name: Install pytest and pytest-cov
run: |
python -m pip install --upgrade pip
python -m pip install --upgrade pytest pytest-cov
- run: pytest --doctest-modules --cov-report=term-missing:skip-covered --cov=project_euler/ project_euler/
validate-solutions:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/setup-python@v4
with:
python-version: 3.x
- name: Install pytest and requests
run: |
python -m pip install --upgrade pip
python -m pip install --upgrade pytest requests
- run: pytest scripts/validate_solutions.py
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
| 1 |
TheAlgorithms/Python | 6,236 | Upgrade GitHub Actions | ### Describe your change:
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
* [x] Upgrade automated testing
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [ ] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in their comments that points to Wikipedia or other similar explanations.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| cclauss | "2022-07-06T20:10:25Z" | "2022-07-07T03:25:25Z" | 9135a1f41192ebe1d835282a1465dc284359d95c | 0a0f4986e4fde05ebc2a24c9cc2cd6b8200b8df1 | Upgrade GitHub Actions. ### Describe your change:
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
* [x] Upgrade automated testing
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [ ] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in their comments that points to Wikipedia or other similar explanations.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v4.1.0
hooks:
- id: check-executables-have-shebangs
- id: check-yaml
- id: end-of-file-fixer
types: [python]
- id: trailing-whitespace
exclude: |
(?x)^(
data_structures/heap/binomial_heap.py
)$
- id: requirements-txt-fixer
- repo: https://github.com/psf/black
rev: 22.3.0
hooks:
- id: black
- repo: https://github.com/PyCQA/isort
rev: 5.10.1
hooks:
- id: isort
args:
- --profile=black
- repo: https://github.com/asottile/pyupgrade
rev: v2.31.0
hooks:
- id: pyupgrade
args:
- --py310-plus
- repo: https://gitlab.com/pycqa/flake8
rev: 3.9.2
hooks:
- id: flake8
args:
- --ignore=E203,W503
- --max-complexity=25
- --max-line-length=88
- repo: https://github.com/pre-commit/mirrors-mypy
rev: v0.931
hooks:
- id: mypy
args:
- --ignore-missing-imports
- --install-types # See mirrors-mypy README.md
- --non-interactive
- repo: https://github.com/codespell-project/codespell
rev: v2.1.0
hooks:
- id: codespell
args:
- --ignore-words-list=ans,crate,fo,followings,hist,iff,mater,secant,som,sur,tim
- --skip="./.*,./strings/dictionary.txt,./strings/words.txt,./project_euler/problem_022/p022_names.txt"
exclude: |
(?x)^(
strings/dictionary.txt |
strings/words.txt |
project_euler/problem_022/p022_names.txt
)$
- repo: local
hooks:
- id: validate-filenames
name: Validate filenames
entry: ./scripts/validate_filenames.py
language: script
pass_filenames: false
| repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v4.3.0
hooks:
- id: check-executables-have-shebangs
- id: check-yaml
- id: end-of-file-fixer
types: [python]
- id: trailing-whitespace
exclude: |
(?x)^(
data_structures/heap/binomial_heap.py
)$
- id: requirements-txt-fixer
- repo: https://github.com/psf/black
rev: 22.6.0
hooks:
- id: black
- repo: https://github.com/PyCQA/isort
rev: 5.10.1
hooks:
- id: isort
args:
- --profile=black
- repo: https://github.com/asottile/pyupgrade
rev: v2.34.0
hooks:
- id: pyupgrade
args:
- --py310-plus
- repo: https://gitlab.com/pycqa/flake8
rev: 3.9.2
hooks:
- id: flake8
args:
- --ignore=E203,W503
- --max-complexity=25
- --max-line-length=88
- repo: https://github.com/pre-commit/mirrors-mypy
rev: v0.961
hooks:
- id: mypy
args:
- --ignore-missing-imports
- --install-types # See mirrors-mypy README.md
- --non-interactive
- repo: https://github.com/codespell-project/codespell
rev: v2.1.0
hooks:
- id: codespell
args:
- --ignore-words-list=ans,crate,fo,followings,hist,iff,mater,secant,som,sur,tim
- --skip="./.*,./strings/dictionary.txt,./strings/words.txt,./project_euler/problem_022/p022_names.txt"
exclude: |
(?x)^(
strings/dictionary.txt |
strings/words.txt |
project_euler/problem_022/p022_names.txt
)$
- repo: local
hooks:
- id: validate-filenames
name: Validate filenames
entry: ./scripts/validate_filenames.py
language: script
pass_filenames: false
| 1 |
TheAlgorithms/Python | 6,236 | Upgrade GitHub Actions | ### Describe your change:
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
* [x] Upgrade automated testing
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [ ] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in their comments that points to Wikipedia or other similar explanations.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| cclauss | "2022-07-06T20:10:25Z" | "2022-07-07T03:25:25Z" | 9135a1f41192ebe1d835282a1465dc284359d95c | 0a0f4986e4fde05ebc2a24c9cc2cd6b8200b8df1 | Upgrade GitHub Actions. ### Describe your change:
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
* [x] Upgrade automated testing
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [ ] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in their comments that points to Wikipedia or other similar explanations.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
|
## Arithmetic Analysis
* [Bisection](arithmetic_analysis/bisection.py)
* [Gaussian Elimination](arithmetic_analysis/gaussian_elimination.py)
* [In Static Equilibrium](arithmetic_analysis/in_static_equilibrium.py)
* [Intersection](arithmetic_analysis/intersection.py)
* [Jacobi Iteration Method](arithmetic_analysis/jacobi_iteration_method.py)
* [Lu Decomposition](arithmetic_analysis/lu_decomposition.py)
* [Newton Forward Interpolation](arithmetic_analysis/newton_forward_interpolation.py)
* [Newton Method](arithmetic_analysis/newton_method.py)
* [Newton Raphson](arithmetic_analysis/newton_raphson.py)
* [Secant Method](arithmetic_analysis/secant_method.py)
## Audio Filters
* [Butterworth Filter](audio_filters/butterworth_filter.py)
* [Iir Filter](audio_filters/iir_filter.py)
* [Show Response](audio_filters/show_response.py)
## Backtracking
* [All Combinations](backtracking/all_combinations.py)
* [All Permutations](backtracking/all_permutations.py)
* [All Subsequences](backtracking/all_subsequences.py)
* [Coloring](backtracking/coloring.py)
* [Hamiltonian Cycle](backtracking/hamiltonian_cycle.py)
* [Knight Tour](backtracking/knight_tour.py)
* [Minimax](backtracking/minimax.py)
* [N Queens](backtracking/n_queens.py)
* [N Queens Math](backtracking/n_queens_math.py)
* [Rat In Maze](backtracking/rat_in_maze.py)
* [Sudoku](backtracking/sudoku.py)
* [Sum Of Subsets](backtracking/sum_of_subsets.py)
## Bit Manipulation
* [Binary And Operator](bit_manipulation/binary_and_operator.py)
* [Binary Count Setbits](bit_manipulation/binary_count_setbits.py)
* [Binary Count Trailing Zeros](bit_manipulation/binary_count_trailing_zeros.py)
* [Binary Or Operator](bit_manipulation/binary_or_operator.py)
* [Binary Shifts](bit_manipulation/binary_shifts.py)
* [Binary Twos Complement](bit_manipulation/binary_twos_complement.py)
* [Binary Xor Operator](bit_manipulation/binary_xor_operator.py)
* [Count 1S Brian Kernighan Method](bit_manipulation/count_1s_brian_kernighan_method.py)
* [Count Number Of One Bits](bit_manipulation/count_number_of_one_bits.py)
* [Gray Code Sequence](bit_manipulation/gray_code_sequence.py)
* [Reverse Bits](bit_manipulation/reverse_bits.py)
* [Single Bit Manipulation Operations](bit_manipulation/single_bit_manipulation_operations.py)
## Blockchain
* [Chinese Remainder Theorem](blockchain/chinese_remainder_theorem.py)
* [Diophantine Equation](blockchain/diophantine_equation.py)
* [Modular Division](blockchain/modular_division.py)
## Boolean Algebra
* [Quine Mc Cluskey](boolean_algebra/quine_mc_cluskey.py)
## Cellular Automata
* [Conways Game Of Life](cellular_automata/conways_game_of_life.py)
* [Game Of Life](cellular_automata/game_of_life.py)
* [Nagel Schrekenberg](cellular_automata/nagel_schrekenberg.py)
* [One Dimensional](cellular_automata/one_dimensional.py)
## Ciphers
* [A1Z26](ciphers/a1z26.py)
* [Affine Cipher](ciphers/affine_cipher.py)
* [Atbash](ciphers/atbash.py)
* [Baconian Cipher](ciphers/baconian_cipher.py)
* [Base16](ciphers/base16.py)
* [Base32](ciphers/base32.py)
* [Base64](ciphers/base64.py)
* [Base85](ciphers/base85.py)
* [Beaufort Cipher](ciphers/beaufort_cipher.py)
* [Bifid](ciphers/bifid.py)
* [Brute Force Caesar Cipher](ciphers/brute_force_caesar_cipher.py)
* [Caesar Cipher](ciphers/caesar_cipher.py)
* [Cryptomath Module](ciphers/cryptomath_module.py)
* [Decrypt Caesar With Chi Squared](ciphers/decrypt_caesar_with_chi_squared.py)
* [Deterministic Miller Rabin](ciphers/deterministic_miller_rabin.py)
* [Diffie](ciphers/diffie.py)
* [Diffie Hellman](ciphers/diffie_hellman.py)
* [Elgamal Key Generator](ciphers/elgamal_key_generator.py)
* [Enigma Machine2](ciphers/enigma_machine2.py)
* [Hill Cipher](ciphers/hill_cipher.py)
* [Mixed Keyword Cypher](ciphers/mixed_keyword_cypher.py)
* [Mono Alphabetic Ciphers](ciphers/mono_alphabetic_ciphers.py)
* [Morse Code](ciphers/morse_code.py)
* [Onepad Cipher](ciphers/onepad_cipher.py)
* [Playfair Cipher](ciphers/playfair_cipher.py)
* [Polybius](ciphers/polybius.py)
* [Porta Cipher](ciphers/porta_cipher.py)
* [Rabin Miller](ciphers/rabin_miller.py)
* [Rail Fence Cipher](ciphers/rail_fence_cipher.py)
* [Rot13](ciphers/rot13.py)
* [Rsa Cipher](ciphers/rsa_cipher.py)
* [Rsa Factorization](ciphers/rsa_factorization.py)
* [Rsa Key Generator](ciphers/rsa_key_generator.py)
* [Shuffled Shift Cipher](ciphers/shuffled_shift_cipher.py)
* [Simple Keyword Cypher](ciphers/simple_keyword_cypher.py)
* [Simple Substitution Cipher](ciphers/simple_substitution_cipher.py)
* [Trafid Cipher](ciphers/trafid_cipher.py)
* [Transposition Cipher](ciphers/transposition_cipher.py)
* [Transposition Cipher Encrypt Decrypt File](ciphers/transposition_cipher_encrypt_decrypt_file.py)
* [Vigenere Cipher](ciphers/vigenere_cipher.py)
* [Xor Cipher](ciphers/xor_cipher.py)
## Compression
* [Burrows Wheeler](compression/burrows_wheeler.py)
* [Huffman](compression/huffman.py)
* [Lempel Ziv](compression/lempel_ziv.py)
* [Lempel Ziv Decompress](compression/lempel_ziv_decompress.py)
* [Peak Signal To Noise Ratio](compression/peak_signal_to_noise_ratio.py)
## Computer Vision
* [Cnn Classification](computer_vision/cnn_classification.py)
* [Flip Augmentation](computer_vision/flip_augmentation.py)
* [Harris Corner](computer_vision/harris_corner.py)
* [Horn Schunck](computer_vision/horn_schunck.py)
* [Mean Threshold](computer_vision/mean_threshold.py)
* [Mosaic Augmentation](computer_vision/mosaic_augmentation.py)
* [Pooling Functions](computer_vision/pooling_functions.py)
## Conversions
* [Binary To Decimal](conversions/binary_to_decimal.py)
* [Binary To Hexadecimal](conversions/binary_to_hexadecimal.py)
* [Binary To Octal](conversions/binary_to_octal.py)
* [Decimal To Any](conversions/decimal_to_any.py)
* [Decimal To Binary](conversions/decimal_to_binary.py)
* [Decimal To Binary Recursion](conversions/decimal_to_binary_recursion.py)
* [Decimal To Hexadecimal](conversions/decimal_to_hexadecimal.py)
* [Decimal To Octal](conversions/decimal_to_octal.py)
* [Excel Title To Column](conversions/excel_title_to_column.py)
* [Hex To Bin](conversions/hex_to_bin.py)
* [Hexadecimal To Decimal](conversions/hexadecimal_to_decimal.py)
* [Length Conversion](conversions/length_conversion.py)
* [Molecular Chemistry](conversions/molecular_chemistry.py)
* [Octal To Decimal](conversions/octal_to_decimal.py)
* [Prefix Conversions](conversions/prefix_conversions.py)
* [Prefix Conversions String](conversions/prefix_conversions_string.py)
* [Pressure Conversions](conversions/pressure_conversions.py)
* [Rgb Hsv Conversion](conversions/rgb_hsv_conversion.py)
* [Roman Numerals](conversions/roman_numerals.py)
* [Temperature Conversions](conversions/temperature_conversions.py)
* [Volume Conversions](conversions/volume_conversions.py)
* [Weight Conversion](conversions/weight_conversion.py)
## Data Structures
* Binary Tree
* [Avl Tree](data_structures/binary_tree/avl_tree.py)
* [Basic Binary Tree](data_structures/binary_tree/basic_binary_tree.py)
* [Binary Search Tree](data_structures/binary_tree/binary_search_tree.py)
* [Binary Search Tree Recursive](data_structures/binary_tree/binary_search_tree_recursive.py)
* [Binary Tree Mirror](data_structures/binary_tree/binary_tree_mirror.py)
* [Binary Tree Traversals](data_structures/binary_tree/binary_tree_traversals.py)
* [Fenwick Tree](data_structures/binary_tree/fenwick_tree.py)
* [Lazy Segment Tree](data_structures/binary_tree/lazy_segment_tree.py)
* [Lowest Common Ancestor](data_structures/binary_tree/lowest_common_ancestor.py)
* [Merge Two Binary Trees](data_structures/binary_tree/merge_two_binary_trees.py)
* [Non Recursive Segment Tree](data_structures/binary_tree/non_recursive_segment_tree.py)
* [Number Of Possible Binary Trees](data_structures/binary_tree/number_of_possible_binary_trees.py)
* [Red Black Tree](data_structures/binary_tree/red_black_tree.py)
* [Segment Tree](data_structures/binary_tree/segment_tree.py)
* [Segment Tree Other](data_structures/binary_tree/segment_tree_other.py)
* [Treap](data_structures/binary_tree/treap.py)
* [Wavelet Tree](data_structures/binary_tree/wavelet_tree.py)
* Disjoint Set
* [Alternate Disjoint Set](data_structures/disjoint_set/alternate_disjoint_set.py)
* [Disjoint Set](data_structures/disjoint_set/disjoint_set.py)
* Hashing
* [Double Hash](data_structures/hashing/double_hash.py)
* [Hash Table](data_structures/hashing/hash_table.py)
* [Hash Table With Linked List](data_structures/hashing/hash_table_with_linked_list.py)
* Number Theory
* [Prime Numbers](data_structures/hashing/number_theory/prime_numbers.py)
* [Quadratic Probing](data_structures/hashing/quadratic_probing.py)
* Heap
* [Binomial Heap](data_structures/heap/binomial_heap.py)
* [Heap](data_structures/heap/heap.py)
* [Heap Generic](data_structures/heap/heap_generic.py)
* [Max Heap](data_structures/heap/max_heap.py)
* [Min Heap](data_structures/heap/min_heap.py)
* [Randomized Heap](data_structures/heap/randomized_heap.py)
* [Skew Heap](data_structures/heap/skew_heap.py)
* Linked List
* [Circular Linked List](data_structures/linked_list/circular_linked_list.py)
* [Deque Doubly](data_structures/linked_list/deque_doubly.py)
* [Doubly Linked List](data_structures/linked_list/doubly_linked_list.py)
* [Doubly Linked List Two](data_structures/linked_list/doubly_linked_list_two.py)
* [From Sequence](data_structures/linked_list/from_sequence.py)
* [Has Loop](data_structures/linked_list/has_loop.py)
* [Is Palindrome](data_structures/linked_list/is_palindrome.py)
* [Merge Two Lists](data_structures/linked_list/merge_two_lists.py)
* [Middle Element Of Linked List](data_structures/linked_list/middle_element_of_linked_list.py)
* [Print Reverse](data_structures/linked_list/print_reverse.py)
* [Singly Linked List](data_structures/linked_list/singly_linked_list.py)
* [Skip List](data_structures/linked_list/skip_list.py)
* [Swap Nodes](data_structures/linked_list/swap_nodes.py)
* Queue
* [Circular Queue](data_structures/queue/circular_queue.py)
* [Circular Queue Linked List](data_structures/queue/circular_queue_linked_list.py)
* [Double Ended Queue](data_structures/queue/double_ended_queue.py)
* [Linked Queue](data_structures/queue/linked_queue.py)
* [Priority Queue Using List](data_structures/queue/priority_queue_using_list.py)
* [Queue On List](data_structures/queue/queue_on_list.py)
* [Queue On Pseudo Stack](data_structures/queue/queue_on_pseudo_stack.py)
* Stacks
* [Balanced Parentheses](data_structures/stacks/balanced_parentheses.py)
* [Dijkstras Two Stack Algorithm](data_structures/stacks/dijkstras_two_stack_algorithm.py)
* [Evaluate Postfix Notations](data_structures/stacks/evaluate_postfix_notations.py)
* [Infix To Postfix Conversion](data_structures/stacks/infix_to_postfix_conversion.py)
* [Infix To Prefix Conversion](data_structures/stacks/infix_to_prefix_conversion.py)
* [Next Greater Element](data_structures/stacks/next_greater_element.py)
* [Postfix Evaluation](data_structures/stacks/postfix_evaluation.py)
* [Prefix Evaluation](data_structures/stacks/prefix_evaluation.py)
* [Stack](data_structures/stacks/stack.py)
* [Stack With Doubly Linked List](data_structures/stacks/stack_with_doubly_linked_list.py)
* [Stack With Singly Linked List](data_structures/stacks/stack_with_singly_linked_list.py)
* [Stock Span Problem](data_structures/stacks/stock_span_problem.py)
* Trie
* [Trie](data_structures/trie/trie.py)
## Digital Image Processing
* [Change Brightness](digital_image_processing/change_brightness.py)
* [Change Contrast](digital_image_processing/change_contrast.py)
* [Convert To Negative](digital_image_processing/convert_to_negative.py)
* Dithering
* [Burkes](digital_image_processing/dithering/burkes.py)
* Edge Detection
* [Canny](digital_image_processing/edge_detection/canny.py)
* Filters
* [Bilateral Filter](digital_image_processing/filters/bilateral_filter.py)
* [Convolve](digital_image_processing/filters/convolve.py)
* [Gabor Filter](digital_image_processing/filters/gabor_filter.py)
* [Gaussian Filter](digital_image_processing/filters/gaussian_filter.py)
* [Median Filter](digital_image_processing/filters/median_filter.py)
* [Sobel Filter](digital_image_processing/filters/sobel_filter.py)
* Histogram Equalization
* [Histogram Stretch](digital_image_processing/histogram_equalization/histogram_stretch.py)
* [Index Calculation](digital_image_processing/index_calculation.py)
* Morphological Operations
* [Dilation Operation](digital_image_processing/morphological_operations/dilation_operation.py)
* [Erosion Operation](digital_image_processing/morphological_operations/erosion_operation.py)
* Resize
* [Resize](digital_image_processing/resize/resize.py)
* Rotation
* [Rotation](digital_image_processing/rotation/rotation.py)
* [Sepia](digital_image_processing/sepia.py)
* [Test Digital Image Processing](digital_image_processing/test_digital_image_processing.py)
## Divide And Conquer
* [Closest Pair Of Points](divide_and_conquer/closest_pair_of_points.py)
* [Convex Hull](divide_and_conquer/convex_hull.py)
* [Heaps Algorithm](divide_and_conquer/heaps_algorithm.py)
* [Heaps Algorithm Iterative](divide_and_conquer/heaps_algorithm_iterative.py)
* [Inversions](divide_and_conquer/inversions.py)
* [Kth Order Statistic](divide_and_conquer/kth_order_statistic.py)
* [Max Difference Pair](divide_and_conquer/max_difference_pair.py)
* [Max Subarray Sum](divide_and_conquer/max_subarray_sum.py)
* [Mergesort](divide_and_conquer/mergesort.py)
* [Peak](divide_and_conquer/peak.py)
* [Power](divide_and_conquer/power.py)
* [Strassen Matrix Multiplication](divide_and_conquer/strassen_matrix_multiplication.py)
## Dynamic Programming
* [Abbreviation](dynamic_programming/abbreviation.py)
* [All Construct](dynamic_programming/all_construct.py)
* [Bitmask](dynamic_programming/bitmask.py)
* [Catalan Numbers](dynamic_programming/catalan_numbers.py)
* [Climbing Stairs](dynamic_programming/climbing_stairs.py)
* [Edit Distance](dynamic_programming/edit_distance.py)
* [Factorial](dynamic_programming/factorial.py)
* [Fast Fibonacci](dynamic_programming/fast_fibonacci.py)
* [Fibonacci](dynamic_programming/fibonacci.py)
* [Floyd Warshall](dynamic_programming/floyd_warshall.py)
* [Fractional Knapsack](dynamic_programming/fractional_knapsack.py)
* [Fractional Knapsack 2](dynamic_programming/fractional_knapsack_2.py)
* [Integer Partition](dynamic_programming/integer_partition.py)
* [Iterating Through Submasks](dynamic_programming/iterating_through_submasks.py)
* [Knapsack](dynamic_programming/knapsack.py)
* [Longest Common Subsequence](dynamic_programming/longest_common_subsequence.py)
* [Longest Increasing Subsequence](dynamic_programming/longest_increasing_subsequence.py)
* [Longest Increasing Subsequence O(Nlogn)](dynamic_programming/longest_increasing_subsequence_o(nlogn).py)
* [Longest Sub Array](dynamic_programming/longest_sub_array.py)
* [Matrix Chain Order](dynamic_programming/matrix_chain_order.py)
* [Max Non Adjacent Sum](dynamic_programming/max_non_adjacent_sum.py)
* [Max Sub Array](dynamic_programming/max_sub_array.py)
* [Max Sum Contiguous Subsequence](dynamic_programming/max_sum_contiguous_subsequence.py)
* [Minimum Coin Change](dynamic_programming/minimum_coin_change.py)
* [Minimum Cost Path](dynamic_programming/minimum_cost_path.py)
* [Minimum Partition](dynamic_programming/minimum_partition.py)
* [Minimum Steps To One](dynamic_programming/minimum_steps_to_one.py)
* [Optimal Binary Search Tree](dynamic_programming/optimal_binary_search_tree.py)
* [Rod Cutting](dynamic_programming/rod_cutting.py)
* [Subset Generation](dynamic_programming/subset_generation.py)
* [Sum Of Subset](dynamic_programming/sum_of_subset.py)
## Electronics
* [Carrier Concentration](electronics/carrier_concentration.py)
* [Coulombs Law](electronics/coulombs_law.py)
* [Electric Power](electronics/electric_power.py)
* [Ohms Law](electronics/ohms_law.py)
## File Transfer
* [Receive File](file_transfer/receive_file.py)
* [Send File](file_transfer/send_file.py)
* Tests
* [Test Send File](file_transfer/tests/test_send_file.py)
## Financial
* [Equated Monthly Installments](financial/equated_monthly_installments.py)
* [Interest](financial/interest.py)
## Fractals
* [Julia Sets](fractals/julia_sets.py)
* [Koch Snowflake](fractals/koch_snowflake.py)
* [Mandelbrot](fractals/mandelbrot.py)
* [Sierpinski Triangle](fractals/sierpinski_triangle.py)
## Fuzzy Logic
* [Fuzzy Operations](fuzzy_logic/fuzzy_operations.py)
## Genetic Algorithm
* [Basic String](genetic_algorithm/basic_string.py)
## Geodesy
* [Haversine Distance](geodesy/haversine_distance.py)
* [Lamberts Ellipsoidal Distance](geodesy/lamberts_ellipsoidal_distance.py)
## Graphics
* [Bezier Curve](graphics/bezier_curve.py)
* [Vector3 For 2D Rendering](graphics/vector3_for_2d_rendering.py)
## Graphs
* [A Star](graphs/a_star.py)
* [Articulation Points](graphs/articulation_points.py)
* [Basic Graphs](graphs/basic_graphs.py)
* [Bellman Ford](graphs/bellman_ford.py)
* [Bfs Shortest Path](graphs/bfs_shortest_path.py)
* [Bfs Zero One Shortest Path](graphs/bfs_zero_one_shortest_path.py)
* [Bidirectional A Star](graphs/bidirectional_a_star.py)
* [Bidirectional Breadth First Search](graphs/bidirectional_breadth_first_search.py)
* [Boruvka](graphs/boruvka.py)
* [Breadth First Search](graphs/breadth_first_search.py)
* [Breadth First Search 2](graphs/breadth_first_search_2.py)
* [Breadth First Search Shortest Path](graphs/breadth_first_search_shortest_path.py)
* [Check Bipartite Graph Bfs](graphs/check_bipartite_graph_bfs.py)
* [Check Bipartite Graph Dfs](graphs/check_bipartite_graph_dfs.py)
* [Check Cycle](graphs/check_cycle.py)
* [Connected Components](graphs/connected_components.py)
* [Depth First Search](graphs/depth_first_search.py)
* [Depth First Search 2](graphs/depth_first_search_2.py)
* [Dijkstra](graphs/dijkstra.py)
* [Dijkstra 2](graphs/dijkstra_2.py)
* [Dijkstra Algorithm](graphs/dijkstra_algorithm.py)
* [Dinic](graphs/dinic.py)
* [Directed And Undirected (Weighted) Graph](graphs/directed_and_undirected_(weighted)_graph.py)
* [Edmonds Karp Multiple Source And Sink](graphs/edmonds_karp_multiple_source_and_sink.py)
* [Eulerian Path And Circuit For Undirected Graph](graphs/eulerian_path_and_circuit_for_undirected_graph.py)
* [Even Tree](graphs/even_tree.py)
* [Finding Bridges](graphs/finding_bridges.py)
* [Frequent Pattern Graph Miner](graphs/frequent_pattern_graph_miner.py)
* [G Topological Sort](graphs/g_topological_sort.py)
* [Gale Shapley Bigraph](graphs/gale_shapley_bigraph.py)
* [Graph List](graphs/graph_list.py)
* [Graph Matrix](graphs/graph_matrix.py)
* [Graphs Floyd Warshall](graphs/graphs_floyd_warshall.py)
* [Greedy Best First](graphs/greedy_best_first.py)
* [Greedy Min Vertex Cover](graphs/greedy_min_vertex_cover.py)
* [Kahns Algorithm Long](graphs/kahns_algorithm_long.py)
* [Kahns Algorithm Topo](graphs/kahns_algorithm_topo.py)
* [Karger](graphs/karger.py)
* [Markov Chain](graphs/markov_chain.py)
* [Matching Min Vertex Cover](graphs/matching_min_vertex_cover.py)
* [Minimum Path Sum](graphs/minimum_path_sum.py)
* [Minimum Spanning Tree Boruvka](graphs/minimum_spanning_tree_boruvka.py)
* [Minimum Spanning Tree Kruskal](graphs/minimum_spanning_tree_kruskal.py)
* [Minimum Spanning Tree Kruskal2](graphs/minimum_spanning_tree_kruskal2.py)
* [Minimum Spanning Tree Prims](graphs/minimum_spanning_tree_prims.py)
* [Minimum Spanning Tree Prims2](graphs/minimum_spanning_tree_prims2.py)
* [Multi Heuristic Astar](graphs/multi_heuristic_astar.py)
* [Page Rank](graphs/page_rank.py)
* [Prim](graphs/prim.py)
* [Random Graph Generator](graphs/random_graph_generator.py)
* [Scc Kosaraju](graphs/scc_kosaraju.py)
* [Strongly Connected Components](graphs/strongly_connected_components.py)
* [Tarjans Scc](graphs/tarjans_scc.py)
* Tests
* [Test Min Spanning Tree Kruskal](graphs/tests/test_min_spanning_tree_kruskal.py)
* [Test Min Spanning Tree Prim](graphs/tests/test_min_spanning_tree_prim.py)
## Greedy Methods
* [Optimal Merge Pattern](greedy_methods/optimal_merge_pattern.py)
## Hashes
* [Adler32](hashes/adler32.py)
* [Chaos Machine](hashes/chaos_machine.py)
* [Djb2](hashes/djb2.py)
* [Enigma Machine](hashes/enigma_machine.py)
* [Hamming Code](hashes/hamming_code.py)
* [Luhn](hashes/luhn.py)
* [Md5](hashes/md5.py)
* [Sdbm](hashes/sdbm.py)
* [Sha1](hashes/sha1.py)
* [Sha256](hashes/sha256.py)
## Knapsack
* [Greedy Knapsack](knapsack/greedy_knapsack.py)
* [Knapsack](knapsack/knapsack.py)
* Tests
* [Test Greedy Knapsack](knapsack/tests/test_greedy_knapsack.py)
* [Test Knapsack](knapsack/tests/test_knapsack.py)
## Linear Algebra
* Src
* [Conjugate Gradient](linear_algebra/src/conjugate_gradient.py)
* [Lib](linear_algebra/src/lib.py)
* [Polynom For Points](linear_algebra/src/polynom_for_points.py)
* [Power Iteration](linear_algebra/src/power_iteration.py)
* [Rayleigh Quotient](linear_algebra/src/rayleigh_quotient.py)
* [Schur Complement](linear_algebra/src/schur_complement.py)
* [Test Linear Algebra](linear_algebra/src/test_linear_algebra.py)
* [Transformations 2D](linear_algebra/src/transformations_2d.py)
## Machine Learning
* [Astar](machine_learning/astar.py)
* [Data Transformations](machine_learning/data_transformations.py)
* [Decision Tree](machine_learning/decision_tree.py)
* Forecasting
* [Run](machine_learning/forecasting/run.py)
* [Gaussian Naive Bayes](machine_learning/gaussian_naive_bayes.py)
* [Gradient Boosting Regressor](machine_learning/gradient_boosting_regressor.py)
* [Gradient Descent](machine_learning/gradient_descent.py)
* [K Means Clust](machine_learning/k_means_clust.py)
* [K Nearest Neighbours](machine_learning/k_nearest_neighbours.py)
* [Knn Sklearn](machine_learning/knn_sklearn.py)
* [Linear Discriminant Analysis](machine_learning/linear_discriminant_analysis.py)
* [Linear Regression](machine_learning/linear_regression.py)
* Local Weighted Learning
* [Local Weighted Learning](machine_learning/local_weighted_learning/local_weighted_learning.py)
* [Logistic Regression](machine_learning/logistic_regression.py)
* Lstm
* [Lstm Prediction](machine_learning/lstm/lstm_prediction.py)
* [Multilayer Perceptron Classifier](machine_learning/multilayer_perceptron_classifier.py)
* [Polymonial Regression](machine_learning/polymonial_regression.py)
* [Random Forest Classifier](machine_learning/random_forest_classifier.py)
* [Random Forest Regressor](machine_learning/random_forest_regressor.py)
* [Scoring Functions](machine_learning/scoring_functions.py)
* [Sequential Minimum Optimization](machine_learning/sequential_minimum_optimization.py)
* [Similarity Search](machine_learning/similarity_search.py)
* [Support Vector Machines](machine_learning/support_vector_machines.py)
* [Word Frequency Functions](machine_learning/word_frequency_functions.py)
## Maths
* [3N Plus 1](maths/3n_plus_1.py)
* [Abs](maths/abs.py)
* [Abs Max](maths/abs_max.py)
* [Abs Min](maths/abs_min.py)
* [Add](maths/add.py)
* [Aliquot Sum](maths/aliquot_sum.py)
* [Allocation Number](maths/allocation_number.py)
* [Area](maths/area.py)
* [Area Under Curve](maths/area_under_curve.py)
* [Armstrong Numbers](maths/armstrong_numbers.py)
* [Average Absolute Deviation](maths/average_absolute_deviation.py)
* [Average Mean](maths/average_mean.py)
* [Average Median](maths/average_median.py)
* [Average Mode](maths/average_mode.py)
* [Bailey Borwein Plouffe](maths/bailey_borwein_plouffe.py)
* [Basic Maths](maths/basic_maths.py)
* [Binary Exp Mod](maths/binary_exp_mod.py)
* [Binary Exponentiation](maths/binary_exponentiation.py)
* [Binary Exponentiation 2](maths/binary_exponentiation_2.py)
* [Binary Exponentiation 3](maths/binary_exponentiation_3.py)
* [Binomial Coefficient](maths/binomial_coefficient.py)
* [Binomial Distribution](maths/binomial_distribution.py)
* [Bisection](maths/bisection.py)
* [Ceil](maths/ceil.py)
* [Check Polygon](maths/check_polygon.py)
* [Chudnovsky Algorithm](maths/chudnovsky_algorithm.py)
* [Collatz Sequence](maths/collatz_sequence.py)
* [Combinations](maths/combinations.py)
* [Decimal Isolate](maths/decimal_isolate.py)
* [Double Factorial Iterative](maths/double_factorial_iterative.py)
* [Double Factorial Recursive](maths/double_factorial_recursive.py)
* [Entropy](maths/entropy.py)
* [Euclidean Distance](maths/euclidean_distance.py)
* [Euclidean Gcd](maths/euclidean_gcd.py)
* [Euler Method](maths/euler_method.py)
* [Euler Modified](maths/euler_modified.py)
* [Eulers Totient](maths/eulers_totient.py)
* [Extended Euclidean Algorithm](maths/extended_euclidean_algorithm.py)
* [Factorial Iterative](maths/factorial_iterative.py)
* [Factorial Recursive](maths/factorial_recursive.py)
* [Factors](maths/factors.py)
* [Fermat Little Theorem](maths/fermat_little_theorem.py)
* [Fibonacci](maths/fibonacci.py)
* [Find Max](maths/find_max.py)
* [Find Max Recursion](maths/find_max_recursion.py)
* [Find Min](maths/find_min.py)
* [Find Min Recursion](maths/find_min_recursion.py)
* [Floor](maths/floor.py)
* [Gamma](maths/gamma.py)
* [Gamma Recursive](maths/gamma_recursive.py)
* [Gaussian](maths/gaussian.py)
* [Greatest Common Divisor](maths/greatest_common_divisor.py)
* [Greedy Coin Change](maths/greedy_coin_change.py)
* [Hardy Ramanujanalgo](maths/hardy_ramanujanalgo.py)
* [Integration By Simpson Approx](maths/integration_by_simpson_approx.py)
* [Is Ip V4 Address Valid](maths/is_ip_v4_address_valid.py)
* [Is Square Free](maths/is_square_free.py)
* [Jaccard Similarity](maths/jaccard_similarity.py)
* [Kadanes](maths/kadanes.py)
* [Karatsuba](maths/karatsuba.py)
* [Krishnamurthy Number](maths/krishnamurthy_number.py)
* [Kth Lexicographic Permutation](maths/kth_lexicographic_permutation.py)
* [Largest Of Very Large Numbers](maths/largest_of_very_large_numbers.py)
* [Largest Subarray Sum](maths/largest_subarray_sum.py)
* [Least Common Multiple](maths/least_common_multiple.py)
* [Line Length](maths/line_length.py)
* [Lucas Lehmer Primality Test](maths/lucas_lehmer_primality_test.py)
* [Lucas Series](maths/lucas_series.py)
* [Matrix Exponentiation](maths/matrix_exponentiation.py)
* [Max Sum Sliding Window](maths/max_sum_sliding_window.py)
* [Median Of Two Arrays](maths/median_of_two_arrays.py)
* [Miller Rabin](maths/miller_rabin.py)
* [Mobius Function](maths/mobius_function.py)
* [Modular Exponential](maths/modular_exponential.py)
* [Monte Carlo](maths/monte_carlo.py)
* [Monte Carlo Dice](maths/monte_carlo_dice.py)
* [Nevilles Method](maths/nevilles_method.py)
* [Newton Raphson](maths/newton_raphson.py)
* [Number Of Digits](maths/number_of_digits.py)
* [Numerical Integration](maths/numerical_integration.py)
* [Perfect Cube](maths/perfect_cube.py)
* [Perfect Number](maths/perfect_number.py)
* [Perfect Square](maths/perfect_square.py)
* [Persistence](maths/persistence.py)
* [Pi Monte Carlo Estimation](maths/pi_monte_carlo_estimation.py)
* [Points Are Collinear 3D](maths/points_are_collinear_3d.py)
* [Pollard Rho](maths/pollard_rho.py)
* [Polynomial Evaluation](maths/polynomial_evaluation.py)
* [Power Using Recursion](maths/power_using_recursion.py)
* [Prime Check](maths/prime_check.py)
* [Prime Factors](maths/prime_factors.py)
* [Prime Numbers](maths/prime_numbers.py)
* [Prime Sieve Eratosthenes](maths/prime_sieve_eratosthenes.py)
* [Primelib](maths/primelib.py)
* [Proth Number](maths/proth_number.py)
* [Pythagoras](maths/pythagoras.py)
* [Qr Decomposition](maths/qr_decomposition.py)
* [Quadratic Equations Complex Numbers](maths/quadratic_equations_complex_numbers.py)
* [Radians](maths/radians.py)
* [Radix2 Fft](maths/radix2_fft.py)
* [Relu](maths/relu.py)
* [Runge Kutta](maths/runge_kutta.py)
* [Segmented Sieve](maths/segmented_sieve.py)
* Series
* [Arithmetic](maths/series/arithmetic.py)
* [Geometric](maths/series/geometric.py)
* [Geometric Series](maths/series/geometric_series.py)
* [Harmonic](maths/series/harmonic.py)
* [Harmonic Series](maths/series/harmonic_series.py)
* [Hexagonal Numbers](maths/series/hexagonal_numbers.py)
* [P Series](maths/series/p_series.py)
* [Sieve Of Eratosthenes](maths/sieve_of_eratosthenes.py)
* [Sigmoid](maths/sigmoid.py)
* [Simpson Rule](maths/simpson_rule.py)
* [Sin](maths/sin.py)
* [Sock Merchant](maths/sock_merchant.py)
* [Softmax](maths/softmax.py)
* [Square Root](maths/square_root.py)
* [Sum Of Arithmetic Series](maths/sum_of_arithmetic_series.py)
* [Sum Of Digits](maths/sum_of_digits.py)
* [Sum Of Geometric Progression](maths/sum_of_geometric_progression.py)
* [Sylvester Sequence](maths/sylvester_sequence.py)
* [Test Prime Check](maths/test_prime_check.py)
* [Trapezoidal Rule](maths/trapezoidal_rule.py)
* [Triplet Sum](maths/triplet_sum.py)
* [Two Pointer](maths/two_pointer.py)
* [Two Sum](maths/two_sum.py)
* [Ugly Numbers](maths/ugly_numbers.py)
* [Volume](maths/volume.py)
* [Zellers Congruence](maths/zellers_congruence.py)
## Matrix
* [Count Islands In Matrix](matrix/count_islands_in_matrix.py)
* [Inverse Of Matrix](matrix/inverse_of_matrix.py)
* [Matrix Class](matrix/matrix_class.py)
* [Matrix Operation](matrix/matrix_operation.py)
* [Nth Fibonacci Using Matrix Exponentiation](matrix/nth_fibonacci_using_matrix_exponentiation.py)
* [Rotate Matrix](matrix/rotate_matrix.py)
* [Searching In Sorted Matrix](matrix/searching_in_sorted_matrix.py)
* [Sherman Morrison](matrix/sherman_morrison.py)
* [Spiral Print](matrix/spiral_print.py)
* Tests
* [Test Matrix Operation](matrix/tests/test_matrix_operation.py)
## Networking Flow
* [Ford Fulkerson](networking_flow/ford_fulkerson.py)
* [Minimum Cut](networking_flow/minimum_cut.py)
## Neural Network
* [2 Hidden Layers Neural Network](neural_network/2_hidden_layers_neural_network.py)
* [Back Propagation Neural Network](neural_network/back_propagation_neural_network.py)
* [Convolution Neural Network](neural_network/convolution_neural_network.py)
* [Perceptron](neural_network/perceptron.py)
## Other
* [Activity Selection](other/activity_selection.py)
* [Alternative List Arrange](other/alternative_list_arrange.py)
* [Check Strong Password](other/check_strong_password.py)
* [Davisb Putnamb Logemannb Loveland](other/davisb_putnamb_logemannb_loveland.py)
* [Dijkstra Bankers Algorithm](other/dijkstra_bankers_algorithm.py)
* [Doomsday](other/doomsday.py)
* [Fischer Yates Shuffle](other/fischer_yates_shuffle.py)
* [Gauss Easter](other/gauss_easter.py)
* [Graham Scan](other/graham_scan.py)
* [Greedy](other/greedy.py)
* [Least Recently Used](other/least_recently_used.py)
* [Lfu Cache](other/lfu_cache.py)
* [Linear Congruential Generator](other/linear_congruential_generator.py)
* [Lru Cache](other/lru_cache.py)
* [Magicdiamondpattern](other/magicdiamondpattern.py)
* [Nested Brackets](other/nested_brackets.py)
* [Password Generator](other/password_generator.py)
* [Scoring Algorithm](other/scoring_algorithm.py)
* [Sdes](other/sdes.py)
* [Tower Of Hanoi](other/tower_of_hanoi.py)
## Physics
* [Horizontal Projectile Motion](physics/horizontal_projectile_motion.py)
* [Lorenz Transformation Four Vector](physics/lorenz_transformation_four_vector.py)
* [N Body Simulation](physics/n_body_simulation.py)
* [Newtons Second Law Of Motion](physics/newtons_second_law_of_motion.py)
## Project Euler
* Problem 001
* [Sol1](project_euler/problem_001/sol1.py)
* [Sol2](project_euler/problem_001/sol2.py)
* [Sol3](project_euler/problem_001/sol3.py)
* [Sol4](project_euler/problem_001/sol4.py)
* [Sol5](project_euler/problem_001/sol5.py)
* [Sol6](project_euler/problem_001/sol6.py)
* [Sol7](project_euler/problem_001/sol7.py)
* Problem 002
* [Sol1](project_euler/problem_002/sol1.py)
* [Sol2](project_euler/problem_002/sol2.py)
* [Sol3](project_euler/problem_002/sol3.py)
* [Sol4](project_euler/problem_002/sol4.py)
* [Sol5](project_euler/problem_002/sol5.py)
* Problem 003
* [Sol1](project_euler/problem_003/sol1.py)
* [Sol2](project_euler/problem_003/sol2.py)
* [Sol3](project_euler/problem_003/sol3.py)
* Problem 004
* [Sol1](project_euler/problem_004/sol1.py)
* [Sol2](project_euler/problem_004/sol2.py)
* Problem 005
* [Sol1](project_euler/problem_005/sol1.py)
* [Sol2](project_euler/problem_005/sol2.py)
* Problem 006
* [Sol1](project_euler/problem_006/sol1.py)
* [Sol2](project_euler/problem_006/sol2.py)
* [Sol3](project_euler/problem_006/sol3.py)
* [Sol4](project_euler/problem_006/sol4.py)
* Problem 007
* [Sol1](project_euler/problem_007/sol1.py)
* [Sol2](project_euler/problem_007/sol2.py)
* [Sol3](project_euler/problem_007/sol3.py)
* Problem 008
* [Sol1](project_euler/problem_008/sol1.py)
* [Sol2](project_euler/problem_008/sol2.py)
* [Sol3](project_euler/problem_008/sol3.py)
* Problem 009
* [Sol1](project_euler/problem_009/sol1.py)
* [Sol2](project_euler/problem_009/sol2.py)
* [Sol3](project_euler/problem_009/sol3.py)
* Problem 010
* [Sol1](project_euler/problem_010/sol1.py)
* [Sol2](project_euler/problem_010/sol2.py)
* [Sol3](project_euler/problem_010/sol3.py)
* Problem 011
* [Sol1](project_euler/problem_011/sol1.py)
* [Sol2](project_euler/problem_011/sol2.py)
* Problem 012
* [Sol1](project_euler/problem_012/sol1.py)
* [Sol2](project_euler/problem_012/sol2.py)
* Problem 013
* [Sol1](project_euler/problem_013/sol1.py)
* Problem 014
* [Sol1](project_euler/problem_014/sol1.py)
* [Sol2](project_euler/problem_014/sol2.py)
* Problem 015
* [Sol1](project_euler/problem_015/sol1.py)
* Problem 016
* [Sol1](project_euler/problem_016/sol1.py)
* [Sol2](project_euler/problem_016/sol2.py)
* Problem 017
* [Sol1](project_euler/problem_017/sol1.py)
* Problem 018
* [Solution](project_euler/problem_018/solution.py)
* Problem 019
* [Sol1](project_euler/problem_019/sol1.py)
* Problem 020
* [Sol1](project_euler/problem_020/sol1.py)
* [Sol2](project_euler/problem_020/sol2.py)
* [Sol3](project_euler/problem_020/sol3.py)
* [Sol4](project_euler/problem_020/sol4.py)
* Problem 021
* [Sol1](project_euler/problem_021/sol1.py)
* Problem 022
* [Sol1](project_euler/problem_022/sol1.py)
* [Sol2](project_euler/problem_022/sol2.py)
* Problem 023
* [Sol1](project_euler/problem_023/sol1.py)
* Problem 024
* [Sol1](project_euler/problem_024/sol1.py)
* Problem 025
* [Sol1](project_euler/problem_025/sol1.py)
* [Sol2](project_euler/problem_025/sol2.py)
* [Sol3](project_euler/problem_025/sol3.py)
* Problem 026
* [Sol1](project_euler/problem_026/sol1.py)
* Problem 027
* [Sol1](project_euler/problem_027/sol1.py)
* Problem 028
* [Sol1](project_euler/problem_028/sol1.py)
* Problem 029
* [Sol1](project_euler/problem_029/sol1.py)
* Problem 030
* [Sol1](project_euler/problem_030/sol1.py)
* Problem 031
* [Sol1](project_euler/problem_031/sol1.py)
* [Sol2](project_euler/problem_031/sol2.py)
* Problem 032
* [Sol32](project_euler/problem_032/sol32.py)
* Problem 033
* [Sol1](project_euler/problem_033/sol1.py)
* Problem 034
* [Sol1](project_euler/problem_034/sol1.py)
* Problem 035
* [Sol1](project_euler/problem_035/sol1.py)
* Problem 036
* [Sol1](project_euler/problem_036/sol1.py)
* Problem 037
* [Sol1](project_euler/problem_037/sol1.py)
* Problem 038
* [Sol1](project_euler/problem_038/sol1.py)
* Problem 039
* [Sol1](project_euler/problem_039/sol1.py)
* Problem 040
* [Sol1](project_euler/problem_040/sol1.py)
* Problem 041
* [Sol1](project_euler/problem_041/sol1.py)
* Problem 042
* [Solution42](project_euler/problem_042/solution42.py)
* Problem 043
* [Sol1](project_euler/problem_043/sol1.py)
* Problem 044
* [Sol1](project_euler/problem_044/sol1.py)
* Problem 045
* [Sol1](project_euler/problem_045/sol1.py)
* Problem 046
* [Sol1](project_euler/problem_046/sol1.py)
* Problem 047
* [Sol1](project_euler/problem_047/sol1.py)
* Problem 048
* [Sol1](project_euler/problem_048/sol1.py)
* Problem 049
* [Sol1](project_euler/problem_049/sol1.py)
* Problem 050
* [Sol1](project_euler/problem_050/sol1.py)
* Problem 051
* [Sol1](project_euler/problem_051/sol1.py)
* Problem 052
* [Sol1](project_euler/problem_052/sol1.py)
* Problem 053
* [Sol1](project_euler/problem_053/sol1.py)
* Problem 054
* [Sol1](project_euler/problem_054/sol1.py)
* [Test Poker Hand](project_euler/problem_054/test_poker_hand.py)
* Problem 055
* [Sol1](project_euler/problem_055/sol1.py)
* Problem 056
* [Sol1](project_euler/problem_056/sol1.py)
* Problem 057
* [Sol1](project_euler/problem_057/sol1.py)
* Problem 058
* [Sol1](project_euler/problem_058/sol1.py)
* Problem 059
* [Sol1](project_euler/problem_059/sol1.py)
* Problem 062
* [Sol1](project_euler/problem_062/sol1.py)
* Problem 063
* [Sol1](project_euler/problem_063/sol1.py)
* Problem 064
* [Sol1](project_euler/problem_064/sol1.py)
* Problem 065
* [Sol1](project_euler/problem_065/sol1.py)
* Problem 067
* [Sol1](project_euler/problem_067/sol1.py)
* [Sol2](project_euler/problem_067/sol2.py)
* Problem 068
* [Sol1](project_euler/problem_068/sol1.py)
* Problem 069
* [Sol1](project_euler/problem_069/sol1.py)
* Problem 070
* [Sol1](project_euler/problem_070/sol1.py)
* Problem 071
* [Sol1](project_euler/problem_071/sol1.py)
* Problem 072
* [Sol1](project_euler/problem_072/sol1.py)
* [Sol2](project_euler/problem_072/sol2.py)
* Problem 074
* [Sol1](project_euler/problem_074/sol1.py)
* [Sol2](project_euler/problem_074/sol2.py)
* Problem 075
* [Sol1](project_euler/problem_075/sol1.py)
* Problem 076
* [Sol1](project_euler/problem_076/sol1.py)
* Problem 077
* [Sol1](project_euler/problem_077/sol1.py)
* Problem 078
* [Sol1](project_euler/problem_078/sol1.py)
* Problem 080
* [Sol1](project_euler/problem_080/sol1.py)
* Problem 081
* [Sol1](project_euler/problem_081/sol1.py)
* Problem 085
* [Sol1](project_euler/problem_085/sol1.py)
* Problem 086
* [Sol1](project_euler/problem_086/sol1.py)
* Problem 087
* [Sol1](project_euler/problem_087/sol1.py)
* Problem 089
* [Sol1](project_euler/problem_089/sol1.py)
* Problem 091
* [Sol1](project_euler/problem_091/sol1.py)
* Problem 092
* [Sol1](project_euler/problem_092/sol1.py)
* Problem 097
* [Sol1](project_euler/problem_097/sol1.py)
* Problem 099
* [Sol1](project_euler/problem_099/sol1.py)
* Problem 101
* [Sol1](project_euler/problem_101/sol1.py)
* Problem 102
* [Sol1](project_euler/problem_102/sol1.py)
* Problem 104
* [Sol](project_euler/problem_104/sol.py)
* Problem 107
* [Sol1](project_euler/problem_107/sol1.py)
* Problem 109
* [Sol1](project_euler/problem_109/sol1.py)
* Problem 112
* [Sol1](project_euler/problem_112/sol1.py)
* Problem 113
* [Sol1](project_euler/problem_113/sol1.py)
* Problem 119
* [Sol1](project_euler/problem_119/sol1.py)
* Problem 120
* [Sol1](project_euler/problem_120/sol1.py)
* Problem 121
* [Sol1](project_euler/problem_121/sol1.py)
* Problem 123
* [Sol1](project_euler/problem_123/sol1.py)
* Problem 125
* [Sol1](project_euler/problem_125/sol1.py)
* Problem 129
* [Sol1](project_euler/problem_129/sol1.py)
* Problem 135
* [Sol1](project_euler/problem_135/sol1.py)
* Problem 144
* [Sol1](project_euler/problem_144/sol1.py)
* Problem 145
* [Sol1](project_euler/problem_145/sol1.py)
* Problem 173
* [Sol1](project_euler/problem_173/sol1.py)
* Problem 174
* [Sol1](project_euler/problem_174/sol1.py)
* Problem 180
* [Sol1](project_euler/problem_180/sol1.py)
* Problem 188
* [Sol1](project_euler/problem_188/sol1.py)
* Problem 191
* [Sol1](project_euler/problem_191/sol1.py)
* Problem 203
* [Sol1](project_euler/problem_203/sol1.py)
* Problem 205
* [Sol1](project_euler/problem_205/sol1.py)
* Problem 206
* [Sol1](project_euler/problem_206/sol1.py)
* Problem 207
* [Sol1](project_euler/problem_207/sol1.py)
* Problem 234
* [Sol1](project_euler/problem_234/sol1.py)
* Problem 301
* [Sol1](project_euler/problem_301/sol1.py)
* Problem 493
* [Sol1](project_euler/problem_493/sol1.py)
* Problem 551
* [Sol1](project_euler/problem_551/sol1.py)
* Problem 686
* [Sol1](project_euler/problem_686/sol1.py)
## Quantum
* [Deutsch Jozsa](quantum/deutsch_jozsa.py)
* [Half Adder](quantum/half_adder.py)
* [Not Gate](quantum/not_gate.py)
* [Quantum Entanglement](quantum/quantum_entanglement.py)
* [Ripple Adder Classic](quantum/ripple_adder_classic.py)
* [Single Qubit Measure](quantum/single_qubit_measure.py)
## Scheduling
* [First Come First Served](scheduling/first_come_first_served.py)
* [Multi Level Feedback Queue](scheduling/multi_level_feedback_queue.py)
* [Non Preemptive Shortest Job First](scheduling/non_preemptive_shortest_job_first.py)
* [Round Robin](scheduling/round_robin.py)
* [Shortest Job First](scheduling/shortest_job_first.py)
## Searches
* [Binary Search](searches/binary_search.py)
* [Binary Tree Traversal](searches/binary_tree_traversal.py)
* [Double Linear Search](searches/double_linear_search.py)
* [Double Linear Search Recursion](searches/double_linear_search_recursion.py)
* [Fibonacci Search](searches/fibonacci_search.py)
* [Hill Climbing](searches/hill_climbing.py)
* [Interpolation Search](searches/interpolation_search.py)
* [Jump Search](searches/jump_search.py)
* [Linear Search](searches/linear_search.py)
* [Quick Select](searches/quick_select.py)
* [Sentinel Linear Search](searches/sentinel_linear_search.py)
* [Simple Binary Search](searches/simple_binary_search.py)
* [Simulated Annealing](searches/simulated_annealing.py)
* [Tabu Search](searches/tabu_search.py)
* [Ternary Search](searches/ternary_search.py)
## Sorts
* [Bead Sort](sorts/bead_sort.py)
* [Bitonic Sort](sorts/bitonic_sort.py)
* [Bogo Sort](sorts/bogo_sort.py)
* [Bubble Sort](sorts/bubble_sort.py)
* [Bucket Sort](sorts/bucket_sort.py)
* [Cocktail Shaker Sort](sorts/cocktail_shaker_sort.py)
* [Comb Sort](sorts/comb_sort.py)
* [Counting Sort](sorts/counting_sort.py)
* [Cycle Sort](sorts/cycle_sort.py)
* [Double Sort](sorts/double_sort.py)
* [Dutch National Flag Sort](sorts/dutch_national_flag_sort.py)
* [Exchange Sort](sorts/exchange_sort.py)
* [External Sort](sorts/external_sort.py)
* [Gnome Sort](sorts/gnome_sort.py)
* [Heap Sort](sorts/heap_sort.py)
* [Insertion Sort](sorts/insertion_sort.py)
* [Intro Sort](sorts/intro_sort.py)
* [Iterative Merge Sort](sorts/iterative_merge_sort.py)
* [Merge Insertion Sort](sorts/merge_insertion_sort.py)
* [Merge Sort](sorts/merge_sort.py)
* [Msd Radix Sort](sorts/msd_radix_sort.py)
* [Natural Sort](sorts/natural_sort.py)
* [Odd Even Sort](sorts/odd_even_sort.py)
* [Odd Even Transposition Parallel](sorts/odd_even_transposition_parallel.py)
* [Odd Even Transposition Single Threaded](sorts/odd_even_transposition_single_threaded.py)
* [Pancake Sort](sorts/pancake_sort.py)
* [Patience Sort](sorts/patience_sort.py)
* [Pigeon Sort](sorts/pigeon_sort.py)
* [Pigeonhole Sort](sorts/pigeonhole_sort.py)
* [Quick Sort](sorts/quick_sort.py)
* [Quick Sort 3 Partition](sorts/quick_sort_3_partition.py)
* [Radix Sort](sorts/radix_sort.py)
* [Random Normal Distribution Quicksort](sorts/random_normal_distribution_quicksort.py)
* [Random Pivot Quick Sort](sorts/random_pivot_quick_sort.py)
* [Recursive Bubble Sort](sorts/recursive_bubble_sort.py)
* [Recursive Insertion Sort](sorts/recursive_insertion_sort.py)
* [Recursive Mergesort Array](sorts/recursive_mergesort_array.py)
* [Recursive Quick Sort](sorts/recursive_quick_sort.py)
* [Selection Sort](sorts/selection_sort.py)
* [Shell Sort](sorts/shell_sort.py)
* [Slowsort](sorts/slowsort.py)
* [Stooge Sort](sorts/stooge_sort.py)
* [Strand Sort](sorts/strand_sort.py)
* [Tim Sort](sorts/tim_sort.py)
* [Topological Sort](sorts/topological_sort.py)
* [Tree Sort](sorts/tree_sort.py)
* [Unknown Sort](sorts/unknown_sort.py)
* [Wiggle Sort](sorts/wiggle_sort.py)
## Strings
* [Aho Corasick](strings/aho_corasick.py)
* [Alternative String Arrange](strings/alternative_string_arrange.py)
* [Anagrams](strings/anagrams.py)
* [Autocomplete Using Trie](strings/autocomplete_using_trie.py)
* [Boyer Moore Search](strings/boyer_moore_search.py)
* [Can String Be Rearranged As Palindrome](strings/can_string_be_rearranged_as_palindrome.py)
* [Capitalize](strings/capitalize.py)
* [Check Anagrams](strings/check_anagrams.py)
* [Check Pangram](strings/check_pangram.py)
* [Credit Card Validator](strings/credit_card_validator.py)
* [Detecting English Programmatically](strings/detecting_english_programmatically.py)
* [Frequency Finder](strings/frequency_finder.py)
* [Indian Phone Validator](strings/indian_phone_validator.py)
* [Is Contains Unique Chars](strings/is_contains_unique_chars.py)
* [Is Palindrome](strings/is_palindrome.py)
* [Jaro Winkler](strings/jaro_winkler.py)
* [Join](strings/join.py)
* [Knuth Morris Pratt](strings/knuth_morris_pratt.py)
* [Levenshtein Distance](strings/levenshtein_distance.py)
* [Lower](strings/lower.py)
* [Manacher](strings/manacher.py)
* [Min Cost String Conversion](strings/min_cost_string_conversion.py)
* [Naive String Search](strings/naive_string_search.py)
* [Ngram](strings/ngram.py)
* [Palindrome](strings/palindrome.py)
* [Prefix Function](strings/prefix_function.py)
* [Rabin Karp](strings/rabin_karp.py)
* [Remove Duplicate](strings/remove_duplicate.py)
* [Reverse Letters](strings/reverse_letters.py)
* [Reverse Long Words](strings/reverse_long_words.py)
* [Reverse Words](strings/reverse_words.py)
* [Split](strings/split.py)
* [Upper](strings/upper.py)
* [Wildcard Pattern Matching](strings/wildcard_pattern_matching.py)
* [Word Occurrence](strings/word_occurrence.py)
* [Word Patterns](strings/word_patterns.py)
* [Z Function](strings/z_function.py)
## Web Programming
* [Co2 Emission](web_programming/co2_emission.py)
* [Covid Stats Via Xpath](web_programming/covid_stats_via_xpath.py)
* [Crawl Google Results](web_programming/crawl_google_results.py)
* [Crawl Google Scholar Citation](web_programming/crawl_google_scholar_citation.py)
* [Currency Converter](web_programming/currency_converter.py)
* [Current Stock Price](web_programming/current_stock_price.py)
* [Current Weather](web_programming/current_weather.py)
* [Daily Horoscope](web_programming/daily_horoscope.py)
* [Download Images From Google Query](web_programming/download_images_from_google_query.py)
* [Emails From Url](web_programming/emails_from_url.py)
* [Fetch Anime And Play](web_programming/fetch_anime_and_play.py)
* [Fetch Bbc News](web_programming/fetch_bbc_news.py)
* [Fetch Github Info](web_programming/fetch_github_info.py)
* [Fetch Jobs](web_programming/fetch_jobs.py)
* [Fetch Well Rx Price](web_programming/fetch_well_rx_price.py)
* [Get Imdb Top 250 Movies Csv](web_programming/get_imdb_top_250_movies_csv.py)
* [Get Imdbtop](web_programming/get_imdbtop.py)
* [Get Top Hn Posts](web_programming/get_top_hn_posts.py)
* [Get User Tweets](web_programming/get_user_tweets.py)
* [Giphy](web_programming/giphy.py)
* [Instagram Crawler](web_programming/instagram_crawler.py)
* [Instagram Pic](web_programming/instagram_pic.py)
* [Instagram Video](web_programming/instagram_video.py)
* [Nasa Data](web_programming/nasa_data.py)
* [Random Anime Character](web_programming/random_anime_character.py)
* [Recaptcha Verification](web_programming/recaptcha_verification.py)
* [Reddit](web_programming/reddit.py)
* [Search Books By Isbn](web_programming/search_books_by_isbn.py)
* [Slack Message](web_programming/slack_message.py)
* [Test Fetch Github Info](web_programming/test_fetch_github_info.py)
* [World Covid19 Stats](web_programming/world_covid19_stats.py)
* Problem 017
* [Sol1](project_euler/problem_017/sol1.py)
* Problem 018
* [Solution](project_euler/problem_018/solution.py)
* Problem 019
* [Sol1](project_euler/problem_019/sol1.py)
* Problem 020
* [Sol1](project_euler/problem_020/sol1.py)
* [Sol2](project_euler/problem_020/sol2.py)
* [Sol3](project_euler/problem_020/sol3.py)
* [Sol4](project_euler/problem_020/sol4.py)
* Problem 021
* [Sol1](project_euler/problem_021/sol1.py)
* Problem 022
* [Sol1](project_euler/problem_022/sol1.py)
* [Sol2](project_euler/problem_022/sol2.py)
* Problem 023
* [Sol1](project_euler/problem_023/sol1.py)
* Problem 024
* [Sol1](project_euler/problem_024/sol1.py)
* Problem 025
* [Sol1](project_euler/problem_025/sol1.py)
* [Sol2](project_euler/problem_025/sol2.py)
* [Sol3](project_euler/problem_025/sol3.py)
* Problem 026
* [Sol1](project_euler/problem_026/sol1.py)
* Problem 027
* [Sol1](project_euler/problem_027/sol1.py)
* Problem 028
* [Sol1](project_euler/problem_028/sol1.py)
* Problem 029
* [Sol1](project_euler/problem_029/sol1.py)
* Problem 030
* [Sol1](project_euler/problem_030/sol1.py)
* Problem 031
* [Sol1](project_euler/problem_031/sol1.py)
* [Sol2](project_euler/problem_031/sol2.py)
* Problem 032
* [Sol32](project_euler/problem_032/sol32.py)
* Problem 033
* [Sol1](project_euler/problem_033/sol1.py)
* Problem 034
* [Sol1](project_euler/problem_034/sol1.py)
* Problem 035
* [Sol1](project_euler/problem_035/sol1.py)
* Problem 036
* [Sol1](project_euler/problem_036/sol1.py)
* Problem 037
* [Sol1](project_euler/problem_037/sol1.py)
* Problem 038
* [Sol1](project_euler/problem_038/sol1.py)
* Problem 039
* [Sol1](project_euler/problem_039/sol1.py)
* Problem 040
* [Sol1](project_euler/problem_040/sol1.py)
* Problem 041
* [Sol1](project_euler/problem_041/sol1.py)
* Problem 042
* [Solution42](project_euler/problem_042/solution42.py)
* Problem 043
* [Sol1](project_euler/problem_043/sol1.py)
* Problem 044
* [Sol1](project_euler/problem_044/sol1.py)
* Problem 045
* [Sol1](project_euler/problem_045/sol1.py)
* Problem 046
* [Sol1](project_euler/problem_046/sol1.py)
* Problem 047
* [Sol1](project_euler/problem_047/sol1.py)
* Problem 048
* [Sol1](project_euler/problem_048/sol1.py)
* Problem 049
* [Sol1](project_euler/problem_049/sol1.py)
* Problem 050
* [Sol1](project_euler/problem_050/sol1.py)
* Problem 051
* [Sol1](project_euler/problem_051/sol1.py)
* Problem 052
* [Sol1](project_euler/problem_052/sol1.py)
* Problem 053
* [Sol1](project_euler/problem_053/sol1.py)
* Problem 054
* [Sol1](project_euler/problem_054/sol1.py)
* [Test Poker Hand](project_euler/problem_054/test_poker_hand.py)
* Problem 055
* [Sol1](project_euler/problem_055/sol1.py)
* Problem 056
* [Sol1](project_euler/problem_056/sol1.py)
* Problem 057
* [Sol1](project_euler/problem_057/sol1.py)
* Problem 058
* [Sol1](project_euler/problem_058/sol1.py)
* Problem 059
* [Sol1](project_euler/problem_059/sol1.py)
* Problem 062
* [Sol1](project_euler/problem_062/sol1.py)
* Problem 063
* [Sol1](project_euler/problem_063/sol1.py)
* Problem 064
* [Sol1](project_euler/problem_064/sol1.py)
* Problem 065
* [Sol1](project_euler/problem_065/sol1.py)
* Problem 067
* [Sol1](project_euler/problem_067/sol1.py)
* [Sol2](project_euler/problem_067/sol2.py)
* Problem 068
* [Sol1](project_euler/problem_068/sol1.py)
* Problem 069
* [Sol1](project_euler/problem_069/sol1.py)
* Problem 070
* [Sol1](project_euler/problem_070/sol1.py)
* Problem 071
* [Sol1](project_euler/problem_071/sol1.py)
* Problem 072
* [Sol1](project_euler/problem_072/sol1.py)
* [Sol2](project_euler/problem_072/sol2.py)
* Problem 074
* [Sol1](project_euler/problem_074/sol1.py)
* [Sol2](project_euler/problem_074/sol2.py)
* Problem 075
* [Sol1](project_euler/problem_075/sol1.py)
* Problem 076
* [Sol1](project_euler/problem_076/sol1.py)
* Problem 077
* [Sol1](project_euler/problem_077/sol1.py)
* Problem 078
* [Sol1](project_euler/problem_078/sol1.py)
* Problem 080
* [Sol1](project_euler/problem_080/sol1.py)
* Problem 081
* [Sol1](project_euler/problem_081/sol1.py)
* Problem 085
* [Sol1](project_euler/problem_085/sol1.py)
* Problem 086
* [Sol1](project_euler/problem_086/sol1.py)
* Problem 087
* [Sol1](project_euler/problem_087/sol1.py)
* Problem 089
* [Sol1](project_euler/problem_089/sol1.py)
* Problem 091
* [Sol1](project_euler/problem_091/sol1.py)
* Problem 092
* [Sol1](project_euler/problem_092/sol1.py)
* Problem 097
* [Sol1](project_euler/problem_097/sol1.py)
* Problem 099
* [Sol1](project_euler/problem_099/sol1.py)
* Problem 101
* [Sol1](project_euler/problem_101/sol1.py)
* Problem 102
* [Sol1](project_euler/problem_102/sol1.py)
* Problem 104
* [Sol](project_euler/problem_104/sol.py)
* Problem 107
* [Sol1](project_euler/problem_107/sol1.py)
* Problem 109
* [Sol1](project_euler/problem_109/sol1.py)
* Problem 112
* [Sol1](project_euler/problem_112/sol1.py)
* Problem 113
* [Sol1](project_euler/problem_113/sol1.py)
* Problem 119
* [Sol1](project_euler/problem_119/sol1.py)
* Problem 120
* [Sol1](project_euler/problem_120/sol1.py)
* Problem 121
* [Sol1](project_euler/problem_121/sol1.py)
* Problem 123
* [Sol1](project_euler/problem_123/sol1.py)
* Problem 125
* [Sol1](project_euler/problem_125/sol1.py)
* Problem 129
* [Sol1](project_euler/problem_129/sol1.py)
* Problem 135
* [Sol1](project_euler/problem_135/sol1.py)
* Problem 144
* [Sol1](project_euler/problem_144/sol1.py)
* Problem 145
* [Sol1](project_euler/problem_145/sol1.py)
* Problem 173
* [Sol1](project_euler/problem_173/sol1.py)
* Problem 174
* [Sol1](project_euler/problem_174/sol1.py)
* Problem 180
* [Sol1](project_euler/problem_180/sol1.py)
* Problem 188
* [Sol1](project_euler/problem_188/sol1.py)
* Problem 191
* [Sol1](project_euler/problem_191/sol1.py)
* Problem 203
* [Sol1](project_euler/problem_203/sol1.py)
* Problem 205
* [Sol1](project_euler/problem_205/sol1.py)
* Problem 206
* [Sol1](project_euler/problem_206/sol1.py)
* Problem 207
* [Sol1](project_euler/problem_207/sol1.py)
* Problem 234
* [Sol1](project_euler/problem_234/sol1.py)
* Problem 301
* [Sol1](project_euler/problem_301/sol1.py)
* Problem 493
* [Sol1](project_euler/problem_493/sol1.py)
* Problem 551
* [Sol1](project_euler/problem_551/sol1.py)
* Problem 686
* [Sol1](project_euler/problem_686/sol1.py)
## Quantum
* [Deutsch Jozsa](quantum/deutsch_jozsa.py)
* [Half Adder](quantum/half_adder.py)
* [Not Gate](quantum/not_gate.py)
* [Quantum Entanglement](quantum/quantum_entanglement.py)
* [Ripple Adder Classic](quantum/ripple_adder_classic.py)
* [Single Qubit Measure](quantum/single_qubit_measure.py)
## Scheduling
* [First Come First Served](scheduling/first_come_first_served.py)
* [Highest Response Ratio Next](scheduling/highest_response_ratio_next.py)
* [Multi Level Feedback Queue](scheduling/multi_level_feedback_queue.py)
* [Non Preemptive Shortest Job First](scheduling/non_preemptive_shortest_job_first.py)
* [Round Robin](scheduling/round_robin.py)
* [Shortest Job First](scheduling/shortest_job_first.py)
## Searches
* [Binary Search](searches/binary_search.py)
* [Binary Tree Traversal](searches/binary_tree_traversal.py)
* [Double Linear Search](searches/double_linear_search.py)
* [Double Linear Search Recursion](searches/double_linear_search_recursion.py)
* [Fibonacci Search](searches/fibonacci_search.py)
* [Hill Climbing](searches/hill_climbing.py)
* [Interpolation Search](searches/interpolation_search.py)
* [Jump Search](searches/jump_search.py)
* [Linear Search](searches/linear_search.py)
* [Quick Select](searches/quick_select.py)
* [Sentinel Linear Search](searches/sentinel_linear_search.py)
* [Simple Binary Search](searches/simple_binary_search.py)
* [Simulated Annealing](searches/simulated_annealing.py)
* [Tabu Search](searches/tabu_search.py)
* [Ternary Search](searches/ternary_search.py)
## Sorts
* [Bead Sort](sorts/bead_sort.py)
* [Bitonic Sort](sorts/bitonic_sort.py)
* [Bogo Sort](sorts/bogo_sort.py)
* [Bubble Sort](sorts/bubble_sort.py)
* [Bucket Sort](sorts/bucket_sort.py)
* [Cocktail Shaker Sort](sorts/cocktail_shaker_sort.py)
* [Comb Sort](sorts/comb_sort.py)
* [Counting Sort](sorts/counting_sort.py)
* [Cycle Sort](sorts/cycle_sort.py)
* [Double Sort](sorts/double_sort.py)
* [Dutch National Flag Sort](sorts/dutch_national_flag_sort.py)
* [Exchange Sort](sorts/exchange_sort.py)
* [External Sort](sorts/external_sort.py)
* [Gnome Sort](sorts/gnome_sort.py)
* [Heap Sort](sorts/heap_sort.py)
* [Insertion Sort](sorts/insertion_sort.py)
* [Intro Sort](sorts/intro_sort.py)
* [Iterative Merge Sort](sorts/iterative_merge_sort.py)
* [Merge Insertion Sort](sorts/merge_insertion_sort.py)
* [Merge Sort](sorts/merge_sort.py)
* [Msd Radix Sort](sorts/msd_radix_sort.py)
* [Natural Sort](sorts/natural_sort.py)
* [Odd Even Sort](sorts/odd_even_sort.py)
* [Odd Even Transposition Parallel](sorts/odd_even_transposition_parallel.py)
* [Odd Even Transposition Single Threaded](sorts/odd_even_transposition_single_threaded.py)
* [Pancake Sort](sorts/pancake_sort.py)
* [Patience Sort](sorts/patience_sort.py)
* [Pigeon Sort](sorts/pigeon_sort.py)
* [Pigeonhole Sort](sorts/pigeonhole_sort.py)
* [Quick Sort](sorts/quick_sort.py)
* [Quick Sort 3 Partition](sorts/quick_sort_3_partition.py)
* [Radix Sort](sorts/radix_sort.py)
* [Random Normal Distribution Quicksort](sorts/random_normal_distribution_quicksort.py)
* [Random Pivot Quick Sort](sorts/random_pivot_quick_sort.py)
* [Recursive Bubble Sort](sorts/recursive_bubble_sort.py)
* [Recursive Insertion Sort](sorts/recursive_insertion_sort.py)
* [Recursive Mergesort Array](sorts/recursive_mergesort_array.py)
* [Recursive Quick Sort](sorts/recursive_quick_sort.py)
* [Selection Sort](sorts/selection_sort.py)
* [Shell Sort](sorts/shell_sort.py)
* [Slowsort](sorts/slowsort.py)
* [Stooge Sort](sorts/stooge_sort.py)
* [Strand Sort](sorts/strand_sort.py)
* [Tim Sort](sorts/tim_sort.py)
* [Topological Sort](sorts/topological_sort.py)
* [Tree Sort](sorts/tree_sort.py)
* [Unknown Sort](sorts/unknown_sort.py)
* [Wiggle Sort](sorts/wiggle_sort.py)
## Strings
* [Aho Corasick](strings/aho_corasick.py)
* [Alternative String Arrange](strings/alternative_string_arrange.py)
* [Anagrams](strings/anagrams.py)
* [Autocomplete Using Trie](strings/autocomplete_using_trie.py)
* [Boyer Moore Search](strings/boyer_moore_search.py)
* [Can String Be Rearranged As Palindrome](strings/can_string_be_rearranged_as_palindrome.py)
* [Capitalize](strings/capitalize.py)
* [Check Anagrams](strings/check_anagrams.py)
* [Check Pangram](strings/check_pangram.py)
* [Credit Card Validator](strings/credit_card_validator.py)
* [Detecting English Programmatically](strings/detecting_english_programmatically.py)
* [Frequency Finder](strings/frequency_finder.py)
* [Hamming Distance](strings/hamming_distance.py)
* [Indian Phone Validator](strings/indian_phone_validator.py)
* [Is Contains Unique Chars](strings/is_contains_unique_chars.py)
* [Is Palindrome](strings/is_palindrome.py)
* [Jaro Winkler](strings/jaro_winkler.py)
* [Join](strings/join.py)
* [Knuth Morris Pratt](strings/knuth_morris_pratt.py)
* [Levenshtein Distance](strings/levenshtein_distance.py)
* [Lower](strings/lower.py)
* [Manacher](strings/manacher.py)
* [Min Cost String Conversion](strings/min_cost_string_conversion.py)
* [Naive String Search](strings/naive_string_search.py)
* [Ngram](strings/ngram.py)
* [Palindrome](strings/palindrome.py)
* [Prefix Function](strings/prefix_function.py)
* [Rabin Karp](strings/rabin_karp.py)
* [Remove Duplicate](strings/remove_duplicate.py)
* [Reverse Letters](strings/reverse_letters.py)
* [Reverse Long Words](strings/reverse_long_words.py)
* [Reverse Words](strings/reverse_words.py)
* [Split](strings/split.py)
* [Upper](strings/upper.py)
* [Wave](strings/wave.py)
* [Wildcard Pattern Matching](strings/wildcard_pattern_matching.py)
* [Word Occurrence](strings/word_occurrence.py)
* [Word Patterns](strings/word_patterns.py)
* [Z Function](strings/z_function.py)
## Web Programming
* [Co2 Emission](web_programming/co2_emission.py)
* [Covid Stats Via Xpath](web_programming/covid_stats_via_xpath.py)
* [Crawl Google Results](web_programming/crawl_google_results.py)
* [Crawl Google Scholar Citation](web_programming/crawl_google_scholar_citation.py)
* [Currency Converter](web_programming/currency_converter.py)
* [Current Stock Price](web_programming/current_stock_price.py)
* [Current Weather](web_programming/current_weather.py)
* [Daily Horoscope](web_programming/daily_horoscope.py)
* [Download Images From Google Query](web_programming/download_images_from_google_query.py)
* [Emails From Url](web_programming/emails_from_url.py)
* [Fetch Anime And Play](web_programming/fetch_anime_and_play.py)
* [Fetch Bbc News](web_programming/fetch_bbc_news.py)
* [Fetch Github Info](web_programming/fetch_github_info.py)
* [Fetch Jobs](web_programming/fetch_jobs.py)
* [Fetch Well Rx Price](web_programming/fetch_well_rx_price.py)
* [Get Imdb Top 250 Movies Csv](web_programming/get_imdb_top_250_movies_csv.py)
* [Get Imdbtop](web_programming/get_imdbtop.py)
* [Get Top Hn Posts](web_programming/get_top_hn_posts.py)
* [Get User Tweets](web_programming/get_user_tweets.py)
* [Giphy](web_programming/giphy.py)
* [Instagram Crawler](web_programming/instagram_crawler.py)
* [Instagram Pic](web_programming/instagram_pic.py)
* [Instagram Video](web_programming/instagram_video.py)
* [Nasa Data](web_programming/nasa_data.py)
* [Random Anime Character](web_programming/random_anime_character.py)
* [Recaptcha Verification](web_programming/recaptcha_verification.py)
* [Reddit](web_programming/reddit.py)
* [Search Books By Isbn](web_programming/search_books_by_isbn.py)
* [Slack Message](web_programming/slack_message.py)
* [Test Fetch Github Info](web_programming/test_fetch_github_info.py)
* [World Covid19 Stats](web_programming/world_covid19_stats.py)
| 1 |
TheAlgorithms/Python | 6,236 | Upgrade GitHub Actions | ### Describe your change:
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
* [x] Upgrade automated testing
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [ ] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in their comments that points to Wikipedia or other similar explanations.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| cclauss | "2022-07-06T20:10:25Z" | "2022-07-07T03:25:25Z" | 9135a1f41192ebe1d835282a1465dc284359d95c | 0a0f4986e4fde05ebc2a24c9cc2cd6b8200b8df1 | Upgrade GitHub Actions. ### Describe your change:
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
* [x] Upgrade automated testing
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [ ] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in their comments that points to Wikipedia or other similar explanations.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| # Backtracking
Backtracking is a way to speed up the search process by removing candidates as soon as they cannot lead to a valid solution of the problem.
* <https://en.wikipedia.org/wiki/Backtracking>
* <https://en.wikipedia.org/wiki/Decision_tree_pruning>
* <https://medium.com/@priyankmistry1999/backtracking-sudoku-6e4439e4825c>
* <https://www.geeksforgeeks.org/sudoku-backtracking-7/>
| # Backtracking
Backtracking is a way to speed up the search process by removing candidates as soon as they cannot lead to a valid solution of the problem.
* <https://en.wikipedia.org/wiki/Backtracking>
* <https://en.wikipedia.org/wiki/Decision_tree_pruning>
* <https://medium.com/@priyankmistry1999/backtracking-sudoku-6e4439e4825c>
* <https://www.geeksforgeeks.org/sudoku-backtracking-7/>
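Below is a minimal illustrative sketch (an addition for this note, not one of the solver files in this directory) of how pruning works, using the classic subset-sum question; the function name and interface are assumptions chosen for the example.

```python
def has_subset_with_sum(numbers: list[int], target: int) -> bool:
    """Return True if some subset of the non-negative `numbers` sums to `target`."""

    def backtrack(index: int, remaining: int) -> bool:
        if remaining == 0:  # found a valid subset
            return True
        if index == len(numbers) or remaining < 0:
            return False  # prune: this branch can no longer succeed
        # Either take the current number or skip it.
        return backtrack(index + 1, remaining - numbers[index]) or backtrack(
            index + 1, remaining
        )

    return backtrack(0, target)


if __name__ == "__main__":
    print(has_subset_with_sum([3, 34, 4, 12, 5, 2], 9))   # True (4 + 5)
    print(has_subset_with_sum([3, 34, 4, 12, 5, 2], 30))  # False
```

The `remaining < 0` check is the pruning step: the branch is abandoned the moment it can no longer produce a solution, instead of enumerating every subset.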
| -1 |
TheAlgorithms/Python | 6,236 | Upgrade GitHub Actions | ### Describe your change:
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
* [x] Upgrade automated testing
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [ ] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in their comments that points to Wikipedia or other similar explanations.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| cclauss | "2022-07-06T20:10:25Z" | "2022-07-07T03:25:25Z" | 9135a1f41192ebe1d835282a1465dc284359d95c | 0a0f4986e4fde05ebc2a24c9cc2cd6b8200b8df1 | Upgrade GitHub Actions. ### Describe your change:
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
* [x] Upgrade automated testing
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [ ] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in their comments that points to Wikipedia or other similar explanations.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| # Compression
Data compression is everywhere; you need it to store data without taking up too much space.
Either the compression loses some data (then we talk about lossy compression, such as .jpg) or it does not (and then it is lossless compression, such as .png).
Lossless compression is mainly used for archival purposes, as it allows storing data without losing any information about the archived file. On the other hand, lossy compression is used for transferring files where perfect quality isn't necessarily required (e.g. images on Twitter).
* <https://www.sciencedirect.com/topics/computer-science/compression-algorithm>
* <https://en.wikipedia.org/wiki/Data_compression>
* <https://en.wikipedia.org/wiki/Pigeonhole_principle>
| # Compression
Data compression is everywhere; you need it to store data without taking up too much space.
Either the compression loses some data (then we talk about lossy compression, such as .jpg) or it does not (and then it is lossless compression, such as .png).
Lossless compression is mainly used for archival purposes, as it allows storing data without losing any information about the archived file. On the other hand, lossy compression is used for transferring files where perfect quality isn't necessarily required (e.g. images on Twitter).
* <https://www.sciencedirect.com/topics/computer-science/compression-algorithm>
* <https://en.wikipedia.org/wiki/Data_compression>
* <https://en.wikipedia.org/wiki/Pigeonhole_principle>
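As a toy illustration of lossless compression (a sketch added for this note, not one of the algorithms in this directory), run-length encoding stores each run of repeated characters as a count plus the character, and decoding recovers the input exactly:

```python
from itertools import groupby


def rle_encode(text: str) -> list[tuple[int, str]]:
    """Run-length encode a string into (count, character) pairs."""
    return [(len(list(group)), char) for char, group in groupby(text)]


def rle_decode(pairs: list[tuple[int, str]]) -> str:
    """Rebuild the original string from (count, character) pairs."""
    return "".join(char * count for count, char in pairs)


if __name__ == "__main__":
    data = "aaaabbbcca"
    encoded = rle_encode(data)
    print(encoded)                      # [(4, 'a'), (3, 'b'), (2, 'c'), (1, 'a')]
    assert rle_decode(encoded) == data  # lossless: decoding restores the input
```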
| -1 |
TheAlgorithms/Python | 6,236 | Upgrade GitHub Actions | ### Describe your change:
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
* [x] Upgrade automated testing
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [ ] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in their comments that points to Wikipedia or other similar explanations.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| cclauss | "2022-07-06T20:10:25Z" | "2022-07-07T03:25:25Z" | 9135a1f41192ebe1d835282a1465dc284359d95c | 0a0f4986e4fde05ebc2a24c9cc2cd6b8200b8df1 | Upgrade GitHub Actions. ### Describe your change:
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
* [x] Upgrade automated testing
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [ ] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in their comments that points to Wikipedia or other similar explanations.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| tasks:
- init: pip3 install -r ./requirements.txt
| tasks:
- init: pip3 install -r ./requirements.txt
| -1 |
TheAlgorithms/Python | 6,236 | Upgrade GitHub Actions | ### Describe your change:
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
* [x] Upgrade automated testing
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [ ] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in their comments that points to Wikipedia or other similar explanations.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| cclauss | "2022-07-06T20:10:25Z" | "2022-07-07T03:25:25Z" | 9135a1f41192ebe1d835282a1465dc284359d95c | 0a0f4986e4fde05ebc2a24c9cc2cd6b8200b8df1 | Upgrade GitHub Actions. ### Describe your change:
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
* [x] Upgrade automated testing
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [ ] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in their comments that points to Wikipedia or other similar explanations.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| # Linear algebra library for Python
This module contains classes and functions for doing linear algebra.
---
## Overview
### class Vector
-
- This class represents a vector of arbitrary size and related operations.
**Overview of the methods:**
- constructor(components) : init the vector
- set(components) : changes the vector components.
- \_\_str\_\_() : toString method
- component(i): gets the i-th component (0-indexed)
- \_\_len\_\_() : gets the size / length of the vector (number of components)
- euclidean_length() : returns the euclidean length of the vector
- operator + : vector addition
- operator - : vector subtraction
- operator * : scalar multiplication and dot product
- copy() : copies this vector and returns it
- change_component(pos,value) : changes the specified component
- function zero_vector(dimension)
- returns a zero vector of 'dimension'
- function unit_basis_vector(dimension, pos)
- returns a unit basis vector with a one at index 'pos' (0-indexed)
- function axpy(scalar, vector1, vector2)
- computes the axpy operation
- function random_vector(N, a, b)
- returns a random vector of size N, with random integer components between 'a' and 'b' inclusive
### class Matrix
-
- This class represents a matrix of arbitrary size and operations on it.
**Overview of the methods:**
- \_\_str\_\_() : returns a string representation
- operator * : implements the matrix-vector multiplication and the matrix-scalar multiplication.
- change_component(x, y, value) : changes the specified component.
- component(x, y) : returns the specified component.
- width() : returns the width of the matrix
- height() : returns the height of the matrix
- determinant() : returns the determinant of the matrix if it is square
- operator + : implements the matrix-addition.
- operator - : implements the matrix-subtraction
- function square_zero_matrix(N)
- returns a square zero-matrix of dimension NxN
- function random_matrix(W, H, a, b)
- returns a random matrix WxH with integer components between 'a' and 'b' inclusive
---
## Documentation
This module uses docstrings to enable the use of Python's in-built `help(...)` function.
For instance, try `help(Vector)`, `help(unit_basis_vector)`, and `help(CLASSNAME.METHODNAME)`.
---
## Usage
Import the module `lib.py` from the **src** directory into your project.
Alternatively, you can directly use the Python bytecode file `lib.pyc`.
---
## Tests
`src/tests.py` contains Python unit tests which can be run with `python3 -m unittest -v`.
| # Linear algebra library for Python
This module contains classes and functions for doing linear algebra.
---
## Overview
### class Vector
-
- This class represents a vector of arbitrary size and related operations.
**Overview of the methods:**
- constructor(components) : init the vector
- set(components) : changes the vector components.
- \_\_str\_\_() : toString method
- component(i): gets the i-th component (0-indexed)
- \_\_len\_\_() : gets the size / length of the vector (number of components)
- euclidean_length() : returns the euclidean length of the vector
- operator + : vector addition
- operator - : vector subtraction
- operator * : scalar multiplication and dot product
- copy() : copies this vector and returns it
- change_component(pos,value) : changes the specified component
- function zero_vector(dimension)
- returns a zero vector of 'dimension'
- function unit_basis_vector(dimension, pos)
- returns a unit basis vector with a one at index 'pos' (0-indexed)
- function axpy(scalar, vector1, vector2)
- computes the axpy operation
- function random_vector(N, a, b)
- returns a random vector of size N, with random integer components between 'a' and 'b' inclusive
### class Matrix
-
- This class represents a matrix of arbitrary size and operations on it.
**Overview of the methods:**
- \_\_str\_\_() : returns a string representation
- operator * : implements the matrix-vector multiplication and the matrix-scalar multiplication.
- change_component(x, y, value) : changes the specified component.
- component(x, y) : returns the specified component.
- width() : returns the width of the matrix
- height() : returns the height of the matrix
- determinant() : returns the determinant of the matrix if it is square
- operator + : implements the matrix-addition.
- operator - : implements the matrix-subtraction
- function square_zero_matrix(N)
- returns a square zero-matrix of dimension NxN
- function random_matrix(W, H, a, b)
- returns a random matrix WxH with integer components between 'a' and 'b' inclusive
---
## Documentation
This module uses docstrings to enable the use of Python's in-built `help(...)` function.
For instance, try `help(Vector)`, `help(unit_basis_vector)`, and `help(CLASSNAME.METHODNAME)`.
---
## Usage
Import the module `lib.py` from the **src** directory into your project.
Alternatively, you can directly use the Python bytecode file `lib.pyc`.
---
## Tests
`src/tests.py` contains Python unit tests which can be run with `python3 -m unittest -v`.
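For orientation, here is a tiny self-contained sketch of two of the vector operations listed in the overview; it is not the `lib.py` implementation itself, and the exact signatures there may differ.

```python
import math


class Vector:
    """Minimal stand-in illustrating two of the operations described above."""

    def __init__(self, components: list[float]) -> None:
        self.components = list(components)

    def euclidean_length(self) -> float:
        # square root of the sum of squared components
        return math.sqrt(sum(c * c for c in self.components))

    def __add__(self, other: "Vector") -> "Vector":
        # component-wise vector addition
        return Vector([a + b for a, b in zip(self.components, other.components)])


if __name__ == "__main__":
    v = Vector([3.0, 4.0])
    w = Vector([1.0, 1.0])
    print(v.euclidean_length())  # 5.0
    print((v + w).components)    # [4.0, 5.0]
```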
| -1 |
TheAlgorithms/Python | 6,236 | Upgrade GitHub Actions | ### Describe your change:
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
* [x] Upgrade automated testing
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [ ] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in their comments that points to Wikipedia or other similar explanations.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| cclauss | "2022-07-06T20:10:25Z" | "2022-07-07T03:25:25Z" | 9135a1f41192ebe1d835282a1465dc284359d95c | 0a0f4986e4fde05ebc2a24c9cc2cd6b8200b8df1 | Upgrade GitHub Actions. ### Describe your change:
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
* [x] Upgrade automated testing
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [ ] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in their comments that points to Wikipedia or other similar explanations.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| # Boolean Algebra
Boolean algebra is used to do arithmetic with bits of values True (1) or False (0).
There are three basic operations: 'and', 'or' and 'not'.
* <https://en.wikipedia.org/wiki/Boolean_algebra>
* <https://plato.stanford.edu/entries/boolalg-math/>
| # Boolean Algebra
Boolean algebra is used to do arithmetic with bits of values True (1) or False (0).
There are three basic operations: 'and', 'or' and 'not'.
* <https://en.wikipedia.org/wiki/Boolean_algebra>
* <https://plato.stanford.edu/entries/boolalg-math/>
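A quick illustration (just a sketch for this note) of the three basic operations using Python's built-in `bool` operators:

```python
# Truth tables for the three basic Boolean operations.
for a in (False, True):
    print(f"not {a} = {not a}")

for a in (False, True):
    for b in (False, True):
        print(f"{a} and {b} = {a and b}   |   {a} or {b} = {a or b}")
```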
| -1 |
TheAlgorithms/Python | 6,236 | Upgrade GitHub Actions | ### Describe your change:
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
* [x] Upgrade automated testing
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [ ] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in their comments that points to Wikipedia or other similar explanations.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| cclauss | "2022-07-06T20:10:25Z" | "2022-07-07T03:25:25Z" | 9135a1f41192ebe1d835282a1465dc284359d95c | 0a0f4986e4fde05ebc2a24c9cc2cd6b8200b8df1 | Upgrade GitHub Actions. ### Describe your change:
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
* [x] Upgrade automated testing
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [ ] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in their comments that points to Wikipedia or other similar explanations.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| # Computer Vision
Computer vision is a field of computer science that works on enabling computers to see, identify and process images in the same way that humans do, and to provide appropriate output.
It is like imparting human intelligence and instincts to a computer.
Image processing and computer vision are a little different from each other. Image processing means applying algorithms that transform an image from one form to another, like smoothing, contrasting, stretching, etc.
Computer vision, in contrast, builds on image processing with the techniques of machine learning: it applies machine learning to recognize patterns for the interpretation of images (much like the process of visual reasoning in human vision).
* <https://en.wikipedia.org/wiki/Computer_vision>
* <https://www.algorithmia.com/blog/introduction-to-computer-vision>
| # Computer Vision
Computer vision is a field of computer science that works on enabling computers to see, identify and process images in the same way that humans do, and to provide appropriate output.
It is like imparting human intelligence and instincts to a computer.
Image processing and computer vision are a little different from each other. Image processing means applying algorithms that transform an image from one form to another, like smoothing, contrasting, stretching, etc.
Computer vision, in contrast, builds on image processing with the techniques of machine learning: it applies machine learning to recognize patterns for the interpretation of images (much like the process of visual reasoning in human vision).
* <https://en.wikipedia.org/wiki/Computer_vision>
* <https://www.algorithmia.com/blog/introduction-to-computer-vision>
| -1 |
TheAlgorithms/Python | 6,236 | Upgrade GitHub Actions | ### Describe your change:
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
* [x] Upgrade automated testing
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [ ] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in their comments that points to Wikipedia or other similar explanations.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| cclauss | "2022-07-06T20:10:25Z" | "2022-07-07T03:25:25Z" | 9135a1f41192ebe1d835282a1465dc284359d95c | 0a0f4986e4fde05ebc2a24c9cc2cd6b8200b8df1 | Upgrade GitHub Actions. ### Describe your change:
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
* [x] Upgrade automated testing
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [ ] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in their comments that points to Wikipedia or other similar explanations.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| # Ciphers
Ciphers are used to protect data from people who are not allowed to have it. They are used everywhere on the internet to protect your connections.
* <https://en.wikipedia.org/wiki/Cipher>
* <http://practicalcryptography.com/ciphers/>
* <https://practicalcryptography.com/ciphers/classical-era/>
| # Ciphers
Ciphers are used to protect data from people who are not allowed to have it. They are used everywhere on the internet to protect your connections.
* <https://en.wikipedia.org/wiki/Cipher>
* <http://practicalcryptography.com/ciphers/>
* <https://practicalcryptography.com/ciphers/classical-era/>
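As a minimal example of a classical-era cipher (an illustrative sketch added for this note, not one of the implementations in this directory), a Caesar cipher shifts each letter by a fixed amount:

```python
def caesar_encrypt(message: str, shift: int) -> str:
    """Shift each ASCII letter by `shift` positions; leave other characters alone."""
    result = []
    for char in message:
        if char.isalpha():
            base = ord("A") if char.isupper() else ord("a")
            result.append(chr((ord(char) - base + shift) % 26 + base))
        else:
            result.append(char)
    return "".join(result)


def caesar_decrypt(ciphertext: str, shift: int) -> str:
    """Decrypting is just encrypting with the opposite shift."""
    return caesar_encrypt(ciphertext, -shift)


if __name__ == "__main__":
    secret = caesar_encrypt("Attack at dawn!", 3)
    print(secret)                     # "Dwwdfn dw gdzq!"
    print(caesar_decrypt(secret, 3))  # "Attack at dawn!"
```

Classical ciphers like this are easy to break and are shown only for intuition; modern connections rely on much stronger schemes.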
| -1 |
TheAlgorithms/Python | 6,236 | Upgrade GitHub Actions | ### Describe your change:
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
* [x] Upgrade automated testing
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [ ] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in their comments that points to Wikipedia or other similar explanations.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| cclauss | "2022-07-06T20:10:25Z" | "2022-07-07T03:25:25Z" | 9135a1f41192ebe1d835282a1465dc284359d95c | 0a0f4986e4fde05ebc2a24c9cc2cd6b8200b8df1 | Upgrade GitHub Actions. ### Describe your change:
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
* [x] Upgrade automated testing
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [ ] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in their comments that points to Wikipedia or other similar explanations.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| # Locally Weighted Linear Regression
It is a non-parametric ML algorithm that does not learn a fixed set of parameters, as **linear regression** does. \
So, here comes the question: what is *linear regression*? \
**Linear regression** is a supervised learning algorithm used for computing linear relationships between input (X) and output (Y). \
### Terminology Involved
number_of_features(i) = Number of features involved. \
number_of_training_examples(m) = Number of training examples. \
output_sequence(y) = Output Sequence. \
$\theta$ $^T$ x = predicted point. \
J($\theta$) = Cost function of point.
The steps involved in ordinary linear regression are:
Training phase: Compute $\theta$ to minimize the cost. \
J($\theta$) = $\sum_{i=1}^m$ (($\theta$)$^T$ $x^i$ - $y^i$)$^2$
Predict output: for given query point x, \
return: ($\theta$)$^T$ x
<img src="https://miro.medium.com/max/700/1*FZsLp8yTULf77qrp0Qd91g.png" alt="Linear Regression">
This training phase is possible when data points are linear, but then again comes the question: can we predict a non-linear relationship between x and y, as shown below?
<img src="https://miro.medium.com/max/700/1*DHYvJg55uN-Kj8jHaxDKvQ.png" alt="Non-linear Data">
<br />
<br />
So, here comes the role of a non-parametric algorithm, which doesn't compute predictions based on a fixed set of params. Rather, parameters $\theta$ are computed individually for each query point/data point x.
<br />
<br />
While computing $\theta$, a higher "preference" is given to points in the vicinity of x than to points farther from x.
Cost Function J($\theta$) = $\sum_{i=1}^m$ $w^i$ (($\theta$)$^T$ $x^i$ - $y^i$)$^2$
$w^i$ is a non-negative weight associated with training point $x^i$. \
$w^i$ is large for $x^i$'s lying closer to query point $x_i$. \
$w^i$ is small for $x^i$'s lying farther from query point $x_i$.
A Typical weight can be computed using \
$w^i$ = $\exp$(-$\frac{(x^i-x)(x^i-x)^T}{2\tau^2}$)
Where $\tau$ is the bandwidth parameter that controls $w^i$ distance from x.
Let's look at an example:
Suppose we had a query point x=5.0 and training points $x^1$=4.9 and $x^2$=3.0, then we can calculate the weights as:
$w^i$ = $\exp$(-$\frac{(x^i-x)(x^i-x)^T}{2\tau^2}$) with $\tau$=0.5
$w^1$ = $\exp$(-$\frac{(4.9-5)^2}{2(0.5)^2}$) = 0.9802
$w^2$ = $\exp$(-$\frac{(3-5)^2}{2(0.5)^2}$) = 0.000335
So, J($\theta$) = 0.9802*($\theta$ $^T$ $x^1$ - $y^1$)$^2$ + 0.000335*($\theta$ $^T$ $x^2$ - $y^2$)$^2$
So, we can conclude that the weights fall exponentially as the distance between x & $x^i$ increases, and so does the contribution of the error in prediction for $x^i$ to the cost.
Steps involved in LWL are : \
Compute $\theta$ to minimize the cost.
J($\theta$) = $\sum_{i=1}^m$ $w^i$ (($\theta$)$^T$ $x^i$ - $y^i$)$^2$ \
Predict Output: for given query point x, \
return : $\theta$ $^T$ x
<img src="https://miro.medium.com/max/700/1*H3QS05Q1GJtY-tiBL00iug.png" alt="LWL">
| # Locally Weighted Linear Regression
It is a non-parametric ML algorithm that does not learn a fixed set of parameters, as **linear regression** does. \
So, here comes the question: what is *linear regression*? \
**Linear regression** is a supervised learning algorithm used for computing linear relationships between input (X) and output (Y). \
### Terminology Involved
number_of_features(i) = Number of features involved. \
number_of_training_examples(m) = Number of training examples. \
output_sequence(y) = Output Sequence. \
$\theta$ $^T$ x = predicted point. \
J($\theta$) = Cost function of point.
The steps involved in ordinary linear regression are:
Training phase: Compute $\theta$ to minimize the cost. \
J($\theta$) = $\sum_{i=1}^m$ (($\theta$)$^T$ $x^i$ - $y^i$)$^2$
Predict output: for given query point x, \
return: ($\theta$)$^T$ x
<img src="https://miro.medium.com/max/700/1*FZsLp8yTULf77qrp0Qd91g.png" alt="Linear Regression">
This training phase is possible when data points are linear, but then again comes the question: can we predict a non-linear relationship between x and y, as shown below?
<img src="https://miro.medium.com/max/700/1*DHYvJg55uN-Kj8jHaxDKvQ.png" alt="Non-linear Data">
<br />
<br />
So, here comes the role of a non-parametric algorithm, which doesn't compute predictions based on a fixed set of params. Rather, parameters $\theta$ are computed individually for each query point/data point x.
<br />
<br />
While computing $\theta$, a higher "preference" is given to points in the vicinity of x than to points farther from x.
Cost Function J($\theta$) = $\sum_{i=1}^m$ $w^i$ (($\theta$)$^T$ $x^i$ - $y^i$)$^2$
$w^i$ is a non-negative weight associated with training point $x^i$. \
$w^i$ is large for $x^i$'s lying closer to query point $x_i$. \
$w^i$ is small for $x^i$'s lying farther from query point $x_i$.
A Typical weight can be computed using \
$w^i$ = $\exp$(-$\frac{(x^i-x)(x^i-x)^T}{2\tau^2}$)
Where $\tau$ is the bandwidth parameter that controls $w^i$ distance from x.
Let's look at an example:
Suppose we had a query point x=5.0 and training points $x^1$=4.9 and $x^2$=3.0, then we can calculate the weights as:
$w^i$ = $\exp$(-$\frac{(x^i-x)(x^i-x)^T}{2\tau^2}$) with $\tau$=0.5
$w^1$ = $\exp$(-$\frac{(4.9-5)^2}{2(0.5)^2}$) = 0.9802
$w^2$ = $\exp$(-$\frac{(3-5)^2}{2(0.5)^2}$) = 0.000335
So, J($\theta$) = 0.9802*($\theta$ $^T$ $x^1$ - $y^1$)$^2$ + 0.000335*($\theta$ $^T$ $x^2$ - $y^2$)$^2$
So, we can conclude that the weights fall exponentially as the distance between x & $x^i$ increases, and so does the contribution of the error in prediction for $x^i$ to the cost.
Steps involved in LWL are : \
Compute $\theta$ to minimize the cost.
J($\theta$) = $\sum_{i=1}^m$ $w^i$ (($\theta$)$^T$ $x^i$ - $y^i$)$^2$ \
Predict Output: for given query point x, \
return : $\theta$ $^T$ x
<img src="https://miro.medium.com/max/700/1*H3QS05Q1GJtY-tiBL00iug.png" alt="LWL">
| -1 |
TheAlgorithms/Python | 6,236 | Upgrade GitHub Actions | ### Describe your change:
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
* [x] Upgrade automated testing
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [ ] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in their comments that points to Wikipedia or other similar explanations.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| cclauss | "2022-07-06T20:10:25Z" | "2022-07-07T03:25:25Z" | 9135a1f41192ebe1d835282a1465dc284359d95c | 0a0f4986e4fde05ebc2a24c9cc2cd6b8200b8df1 | Upgrade GitHub Actions. ### Describe your change:
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
* [x] Upgrade automated testing
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [ ] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in their comments that points to Wikipedia or other similar explanations.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| ### Describe your change:
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [ ] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [ ] This pull request is all my own work -- I have not plagiarized.
* [ ] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in their comments that points to Wikipedia or other similar explanations.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| ### Describe your change:
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
### Checklist:
* [ ] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [ ] This pull request is all my own work -- I have not plagiarized.
* [ ] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in their comments that points to Wikipedia or other similar explanations.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| -1 |
TheAlgorithms/Python | 6,236 | Upgrade GitHub Actions | ### Describe your change:
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
* [x] Upgrade automated testing
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [ ] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in their comments that points to Wikipedia or other similar explanations.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| cclauss | "2022-07-06T20:10:25Z" | "2022-07-07T03:25:25Z" | 9135a1f41192ebe1d835282a1465dc284359d95c | 0a0f4986e4fde05ebc2a24c9cc2cd6b8200b8df1 | Upgrade GitHub Actions. ### Describe your change:
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
* [x] Upgrade automated testing
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [ ] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in their comments that points to Wikipedia or other similar explanations.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| <div align="center">
<!-- Title: -->
<a href="https://github.com/TheAlgorithms/">
<img src="https://raw.githubusercontent.com/TheAlgorithms/website/1cd824df116b27029f17c2d1b42d81731f28a920/public/logo.svg" height="100">
</a>
<h1><a href="https://github.com/TheAlgorithms/">The Algorithms</a> - Python</h1>
<!-- Labels: -->
<!-- First row: -->
<a href="https://gitpod.io/#https://github.com/TheAlgorithms/Python">
<img src="https://img.shields.io/badge/Gitpod-Ready--to--Code-blue?logo=gitpod&style=flat-square" height="20" alt="Gitpod Ready-to-Code">
</a>
<a href="https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md">
<img src="https://img.shields.io/static/v1.svg?label=Contributions&message=Welcome&color=0059b3&style=flat-square" height="20" alt="Contributions Welcome">
</a>
<a href="https://www.paypal.me/TheAlgorithms/100">
<img src="https://img.shields.io/badge/Donate-PayPal-green.svg?logo=paypal&style=flat-square" height="20" alt="Donate">
</a>
<img src="https://img.shields.io/github/repo-size/TheAlgorithms/Python.svg?label=Repo%20size&style=flat-square" height="20">
<a href="https://discord.gg/c7MnfGFGa6">
<img src="https://img.shields.io/discord/808045925556682782.svg?logo=discord&colorB=7289DA&style=flat-square" height="20" alt="Discord chat">
</a>
<a href="https://gitter.im/TheAlgorithms">
<img src="https://img.shields.io/badge/Chat-Gitter-ff69b4.svg?label=Chat&logo=gitter&style=flat-square" height="20" alt="Gitter chat">
</a>
<!-- Second row: -->
<br>
<a href="https://github.com/TheAlgorithms/Python/actions">
<img src="https://img.shields.io/github/workflow/status/TheAlgorithms/Python/build?label=CI&logo=github&style=flat-square" height="20" alt="GitHub Workflow Status">
</a>
<a href="https://lgtm.com/projects/g/TheAlgorithms/Python/alerts">
<img src="https://img.shields.io/lgtm/alerts/github/TheAlgorithms/Python.svg?label=LGTM&logo=LGTM&style=flat-square" height="20" alt="LGTM">
</a>
<a href="https://github.com/pre-commit/pre-commit">
<img src="https://img.shields.io/badge/pre--commit-enabled-brightgreen?logo=pre-commit&logoColor=white&style=flat-square" height="20" alt="pre-commit">
</a>
<a href="https://github.com/psf/black">
<img src="https://img.shields.io/static/v1?label=code%20style&message=black&color=black&style=flat-square" height="20" alt="code style: black">
</a>
<!-- Short description: -->
<h3>All algorithms implemented in Python - for education</h3>
</div>
Implementations are for learning purposes only. As they may be less efficient than the implementations in the Python standard library, use them at your discretion.
## Getting Started
Read through our [Contribution Guidelines](CONTRIBUTING.md) before you contribute.
## Community Channels
We're on [Discord](https://discord.gg/c7MnfGFGa6) and [Gitter](https://gitter.im/TheAlgorithms)! Community channels are great for you to ask questions and get help. Please join us!
## List of Algorithms
See our [directory](DIRECTORY.md) for easier navigation and better overview of the project.
| <div align="center">
<!-- Title: -->
<a href="https://github.com/TheAlgorithms/">
<img src="https://raw.githubusercontent.com/TheAlgorithms/website/1cd824df116b27029f17c2d1b42d81731f28a920/public/logo.svg" height="100">
</a>
<h1><a href="https://github.com/TheAlgorithms/">The Algorithms</a> - Python</h1>
<!-- Labels: -->
<!-- First row: -->
<a href="https://gitpod.io/#https://github.com/TheAlgorithms/Python">
<img src="https://img.shields.io/badge/Gitpod-Ready--to--Code-blue?logo=gitpod&style=flat-square" height="20" alt="Gitpod Ready-to-Code">
</a>
<a href="https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md">
<img src="https://img.shields.io/static/v1.svg?label=Contributions&message=Welcome&color=0059b3&style=flat-square" height="20" alt="Contributions Welcome">
</a>
<a href="https://www.paypal.me/TheAlgorithms/100">
<img src="https://img.shields.io/badge/Donate-PayPal-green.svg?logo=paypal&style=flat-square" height="20" alt="Donate">
</a>
<img src="https://img.shields.io/github/repo-size/TheAlgorithms/Python.svg?label=Repo%20size&style=flat-square" height="20">
<a href="https://discord.gg/c7MnfGFGa6">
<img src="https://img.shields.io/discord/808045925556682782.svg?logo=discord&colorB=7289DA&style=flat-square" height="20" alt="Discord chat">
</a>
<a href="https://gitter.im/TheAlgorithms">
<img src="https://img.shields.io/badge/Chat-Gitter-ff69b4.svg?label=Chat&logo=gitter&style=flat-square" height="20" alt="Gitter chat">
</a>
<!-- Second row: -->
<br>
<a href="https://github.com/TheAlgorithms/Python/actions">
<img src="https://img.shields.io/github/workflow/status/TheAlgorithms/Python/build?label=CI&logo=github&style=flat-square" height="20" alt="GitHub Workflow Status">
</a>
<a href="https://lgtm.com/projects/g/TheAlgorithms/Python/alerts">
<img src="https://img.shields.io/lgtm/alerts/github/TheAlgorithms/Python.svg?label=LGTM&logo=LGTM&style=flat-square" height="20" alt="LGTM">
</a>
<a href="https://github.com/pre-commit/pre-commit">
<img src="https://img.shields.io/badge/pre--commit-enabled-brightgreen?logo=pre-commit&logoColor=white&style=flat-square" height="20" alt="pre-commit">
</a>
<a href="https://github.com/psf/black">
<img src="https://img.shields.io/static/v1?label=code%20style&message=black&color=black&style=flat-square" height="20" alt="code style: black">
</a>
<!-- Short description: -->
<h3>All algorithms implemented in Python - for education</h3>
</div>
Implementations are for learning purposes only. As they may be less efficient than the implementations in the Python standard library, use them at your discretion.
## Getting Started
Read through our [Contribution Guidelines](CONTRIBUTING.md) before you contribute.
## Community Channels
We're on [Discord](https://discord.gg/c7MnfGFGa6) and [Gitter](https://gitter.im/TheAlgorithms)! Community channels are great for you to ask questions and get help. Please join us!
## List of Algorithms
See our [directory](DIRECTORY.md) for easier navigation and better overview of the project.
| -1 |
TheAlgorithms/Python | 6,236 | Upgrade GitHub Actions | ### Describe your change:
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
* [x] Upgrade automated testing
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [ ] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in their comments that points to Wikipedia or other similar explanations.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| cclauss | "2022-07-06T20:10:25Z" | "2022-07-07T03:25:25Z" | 9135a1f41192ebe1d835282a1465dc284359d95c | 0a0f4986e4fde05ebc2a24c9cc2cd6b8200b8df1 | Upgrade GitHub Actions. ### Describe your change:
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
* [x] Upgrade automated testing
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [ ] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in their comments that points to Wikipedia or other similar explanations.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| # Arithmetic analysis
Arithmetic analysis is a branch of mathematics that deals with solving linear equations.
* <https://en.wikipedia.org/wiki/System_of_linear_equations>
* <https://en.wikipedia.org/wiki/Gaussian_elimination>
* <https://en.wikipedia.org/wiki/Root-finding_algorithms>
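As a quick illustration of the kind of problem these routines address, the 2x2 system 2x + y = 5, x + 3y = 10 can be solved as follows (this sketch simply calls NumPy's solver and is not code from this directory):
```python
import numpy as np

# Solve 2x + y = 5 and x + 3y = 10.
coefficients = np.array([[2.0, 1.0], [1.0, 3.0]])
constants = np.array([5.0, 10.0])
print(np.linalg.solve(coefficients, constants))  # [1. 3.], i.e. x = 1, y = 3
```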
| # Arithmetic analysis
Arithmetic analysis is a branch of mathematics that deals with solving linear equations.
* <https://en.wikipedia.org/wiki/System_of_linear_equations>
* <https://en.wikipedia.org/wiki/Gaussian_elimination>
* <https://en.wikipedia.org/wiki/Root-finding_algorithms>
| -1 |
TheAlgorithms/Python | 6,236 | Upgrade GitHub Actions | ### Describe your change:
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
* [x] Upgrade automated testing
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [ ] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in their comments that points to Wikipedia or other similar explanations.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| cclauss | "2022-07-06T20:10:25Z" | "2022-07-07T03:25:25Z" | 9135a1f41192ebe1d835282a1465dc284359d95c | 0a0f4986e4fde05ebc2a24c9cc2cd6b8200b8df1 | Upgrade GitHub Actions. ### Describe your change:
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
* [x] Upgrade automated testing
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [ ] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in their comments that points to Wikipedia or other similar explanations.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| ### Interest
* Compound Interest: "Compound interest is calculated by multiplying the initial principal amount by one plus the annual interest rate raised to the number of compound periods minus one." [Compound Interest](https://www.investopedia.com/)
* Simple Interest: "Simple interest paid or received over a certain period is a fixed percentage of the principal amount that was borrowed or lent." [Simple Interest](https://www.investopedia.com/)
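For illustration, a small sketch of both formulas (the function names and the example figures are assumptions, not code taken from this directory):
```python
def compound_interest(principal: float, rate_per_period: float, periods: int) -> float:
    """Interest earned: principal * ((1 + rate) ** periods - 1)."""
    return principal * ((1 + rate_per_period) ** periods - 1)


def simple_interest(principal: float, rate_per_period: float, periods: float) -> float:
    """Interest earned: principal * rate * periods."""
    return principal * rate_per_period * periods


print(compound_interest(1000, 0.05, 3))  # roughly 157.63
print(simple_interest(1000, 0.05, 3))  # 150.0
```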
| ### Interest
* Compound Interest: "Compound interest is calculated by multiplying the initial principal amount by one plus the annual interest rate raised to the number of compound periods minus one." [Compound Interest](https://www.investopedia.com/)
* Simple Interest: "Simple interest paid or received over a certain period is a fixed percentage of the principal amount that was borrowed or lent." [Simple Interest](https://www.investopedia.com/)
| -1 |
TheAlgorithms/Python | 6,236 | Upgrade GitHub Actions | ### Describe your change:
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
* [x] Upgrade automated testing
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [ ] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in their comments that points to Wikipedia or other similar explanations.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| cclauss | "2022-07-06T20:10:25Z" | "2022-07-07T03:25:25Z" | 9135a1f41192ebe1d835282a1465dc284359d95c | 0a0f4986e4fde05ebc2a24c9cc2cd6b8200b8df1 | Upgrade GitHub Actions. ### Describe your change:
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
* [x] Upgrade automated testing
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [ ] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in their comments that points to Wikipedia or other similar explanations.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| # A naive recursive implementation of 0-1 Knapsack Problem
This overview is taken from:
https://en.wikipedia.org/wiki/Knapsack_problem
---
## Overview
The knapsack problem is a problem in combinatorial optimization: Given a set of items, each with a weight and a value, determine the number of each item to include in a collection so that the total weight is less than or equal to a given limit and the total value is as large as possible. It derives its name from the problem faced by someone who is constrained by a fixed-size knapsack and must fill it with the most valuable items. The problem often arises in resource allocation where the decision makers have to choose from a set of non-divisible projects or tasks under a fixed budget or time constraint, respectively.
The knapsack problem has been studied for more than a century, with early works dating as far back as 1897. The name "knapsack problem" dates back to the early works of mathematician Tobias Dantzig (1884–1956), and refers to the commonplace problem of packing the most valuable or useful items without overloading the luggage.
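To make the overview concrete, a naive recursive formulation of this 0-1 decision could look like the sketch below (an illustration only; the module shipped in this directory may differ in naming and details):
```python
def knapsack(capacity: int, weights: list[int], values: list[int], n: int) -> int:
    """Maximum total value of the first n items that fits within capacity."""
    if n == 0 or capacity == 0:
        return 0
    if weights[n - 1] > capacity:
        # The n-th item is too heavy, so it cannot be included.
        return knapsack(capacity, weights, values, n - 1)
    # Otherwise, take the better of including or excluding the n-th item.
    return max(
        values[n - 1] + knapsack(capacity - weights[n - 1], weights, values, n - 1),
        knapsack(capacity, weights, values, n - 1),
    )


print(knapsack(50, [10, 20, 30], [60, 100, 120], 3))  # 220
```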
---
## Documentation
This module uses docstrings to enable the use of Python's in-built `help(...)` function.
For instance, try `help(Vector)`, `help(unit_basis_vector)`, and `help(CLASSNAME.METHODNAME)`.
---
## Usage
Import the module `knapsack.py` from the **.** directory into your project.
---
## Tests
`.` contains Python unit tests which can be run with `python3 -m unittest -v`.
| # A naive recursive implementation of 0-1 Knapsack Problem
This overview is taken from:
https://en.wikipedia.org/wiki/Knapsack_problem
---
## Overview
The knapsack problem is a problem in combinatorial optimization: Given a set of items, each with a weight and a value, determine the number of each item to include in a collection so that the total weight is less than or equal to a given limit and the total value is as large as possible. It derives its name from the problem faced by someone who is constrained by a fixed-size knapsack and must fill it with the most valuable items. The problem often arises in resource allocation where the decision makers have to choose from a set of non-divisible projects or tasks under a fixed budget or time constraint, respectively.
The knapsack problem has been studied for more than a century, with early works dating as far back as 1897. The name "knapsack problem" dates back to the early works of mathematician Tobias Dantzig (1884–1956), and refers to the commonplace problem of packing the most valuable or useful items without overloading the luggage.
---
## Documentation
This module uses docstrings to enable the use of Python's in-built `help(...)` function.
For instance, try `help(Vector)`, `help(unit_basis_vector)`, and `help(CLASSNAME.METHODNAME)`.
---
## Usage
Import the module `knapsack.py` from the **.** directory into your project.
---
## Tests
`.` contains Python unit tests which can be run with `python3 -m unittest -v`.
| -1 |
TheAlgorithms/Python | 6,236 | Upgrade GitHub Actions | ### Describe your change:
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
* [x] Upgrade automated testing
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [ ] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in their comments that points to Wikipedia or other similar explanations.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| cclauss | "2022-07-06T20:10:25Z" | "2022-07-07T03:25:25Z" | 9135a1f41192ebe1d835282a1465dc284359d95c | 0a0f4986e4fde05ebc2a24c9cc2cd6b8200b8df1 | Upgrade GitHub Actions. ### Describe your change:
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
* [x] Upgrade automated testing
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [ ] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in their comments that points to Wikipedia or other similar explanations.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| # Conversion
Conversion programs convert a type of data, a number from a numerical base or unit into one of another type, base or unit, e.g. binary to decimal, integer to string or foot to meters.
* <https://en.wikipedia.org/wiki/Data_conversion>
* <https://en.wikipedia.org/wiki/Transcoding>
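A tiny example of one such conversion, binary to decimal (an illustrative sketch, not one of the programs in this directory):
```python
def binary_to_decimal(binary: str) -> int:
    """Convert a binary string such as "1011" to its decimal value."""
    decimal = 0
    for digit in binary:
        decimal = decimal * 2 + int(digit)
    return decimal


print(binary_to_decimal("1011"))  # 11
```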
| # Conversion
Conversion programs convert a type of data, a number from a numerical base or unit into one of another type, base or unit, e.g. binary to decimal, integer to string or foot to meters.
* <https://en.wikipedia.org/wiki/Data_conversion>
* <https://en.wikipedia.org/wiki/Transcoding>
| -1 |
TheAlgorithms/Python | 6,236 | Upgrade GitHub Actions | ### Describe your change:
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
* [x] Upgrade automated testing
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [ ] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in their comments that points to Wikipedia or other similar explanations.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| cclauss | "2022-07-06T20:10:25Z" | "2022-07-07T03:25:25Z" | 9135a1f41192ebe1d835282a1465dc284359d95c | 0a0f4986e4fde05ebc2a24c9cc2cd6b8200b8df1 | Upgrade GitHub Actions. ### Describe your change:
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
* [x] Upgrade automated testing
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [ ] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in their comments that points to Wikipedia or other similar explanations.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| # Contributing guidelines
## Before contributing
Welcome to [TheAlgorithms/Python](https://github.com/TheAlgorithms/Python)! Before sending your pull requests, make sure that you __read the whole guidelines__. If you have any doubt on the contributing guide, please feel free to [state it clearly in an issue](https://github.com/TheAlgorithms/Python/issues/new) or ask the community in [Gitter](https://gitter.im/TheAlgorithms).
## Contributing
### Contributor
We are very happy that you consider implementing algorithms and data structures for others! This repository is referenced and used by learners from all over the globe. Being one of our contributors, you agree and confirm that:
- You did your work - no plagiarism allowed
- Any plagiarized work will not be merged.
- Your work will be distributed under [MIT License](LICENSE.md) once your pull request is merged
- Your submitted work fulfils or mostly fulfils our styles and standards
__New implementation__ is welcome! For example, new solutions for a problem, different representations for a graph data structure or algorithm designs with different complexity but __identical implementation__ of an existing implementation is not allowed. Please check whether the solution is already implemented or not before submitting your pull request.
__Improving comments__ and __writing proper tests__ are also highly welcome.
### Contribution
We appreciate any contribution, from fixing a grammar mistake in a comment to implementing complex algorithms. Please read this section if you are contributing your work.
Your contribution will be tested by our [automated testing on GitHub Actions](https://github.com/TheAlgorithms/Python/actions) to save time and mental energy. After you have submitted your pull request, you should see the GitHub Actions tests start to run at the bottom of your submission page. If those tests fail, then click on the ___details___ button and try to read through the GitHub Actions output to understand the failure. If you do not understand, please leave a comment on your submission page and a community member will try to help.
Please help us keep our issue list small by adding fixes: #{$ISSUE_NO} to the commit message of pull requests that resolve open issues. GitHub will use this tag to auto-close the issue when the PR is merged.
#### What is an Algorithm?
An Algorithm is one or more functions (or classes) that:
* take one or more inputs,
* perform some internal calculations or data manipulations,
* return one or more outputs,
* have minimal side effects (Ex. `print()`, `plot()`, `read()`, `write()`).
Algorithms should be packaged in a way that would make it easy for readers to put them into larger programs.
Algorithms should:
* have intuitive class and function names that make their purpose clear to readers
* use Python naming conventions and intuitive variable names to ease comprehension
* be flexible to take different input values
* have Python type hints for their input parameters and return values
* raise Python exceptions (`ValueError`, etc.) on erroneous input values
* have docstrings with clear explanations and/or URLs to source materials
* contain doctests that test both valid and erroneous input values
* return all calculation results instead of printing or plotting them
Algorithms in this repo should not be how-to examples for existing Python packages. Instead, they should perform internal calculations or manipulations to convert input values into different output values. Those calculations or manipulations can use data types, classes, or functions of existing Python packages but each algorithm in this repo should add unique value.
#### Pre-commit plugin
Use [pre-commit](https://pre-commit.com/#installation) to automatically format your code to match our coding style:
```bash
python3 -m pip install pre-commit # only required the first time
pre-commit install
```
That's it! The plugin will run every time you commit any changes. If there are any errors found during the run, fix them and commit those changes. You can even run the plugin manually on all files:
```bash
pre-commit run --all-files --show-diff-on-failure
```
#### Coding Style
We want your work to be readable by others; therefore, we encourage you to note the following:
- Please write in Python 3.9+. For instance: `print()` is a function in Python 3 so `print "Hello"` will *not* work but `print("Hello")` will.
- Please focus hard on the naming of functions, classes, and variables. Help your reader by using __descriptive names__ that can help you to remove redundant comments.
- Single letter variable names are *old school* so please avoid them unless their life only spans a few lines.
- Expand acronyms because `gcd()` is hard to understand but `greatest_common_divisor()` is not.
- Please follow the [Python Naming Conventions](https://pep8.org/#prescriptive-naming-conventions) so variable_names and function_names should be lower_case, CONSTANTS in UPPERCASE, ClassNames should be CamelCase, etc.
- We encourage the use of Python [f-strings](https://realpython.com/python-f-strings/#f-strings-a-new-and-improved-way-to-format-strings-in-python) where they make the code easier to read.
- Please consider running [__psf/black__](https://github.com/python/black) on your Python file(s) before submitting your pull request. This is not yet a requirement but it does make your code more readable and automatically aligns it with much of [PEP 8](https://www.python.org/dev/peps/pep-0008/). There are other code formatters (autopep8, yapf) but the __black__ formatter is now hosted by the Python Software Foundation. To use it,
```bash
python3 -m pip install black # only required the first time
black .
```
- All submissions will need to pass the test `flake8 . --ignore=E203,W503 --max-line-length=88` before they will be accepted so if possible, try this test locally on your Python file(s) before submitting your pull request.
```bash
python3 -m pip install flake8 # only required the first time
flake8 . --ignore=E203,W503 --max-line-length=88 --show-source
```
- Original code submissions require docstrings or comments to describe your work.
- More on docstrings and comments:
If you used a Wikipedia article or some other source material to create your algorithm, please add the URL in a docstring or comment to help your reader.
The following are considered to be bad and may be requested to be improved:
```python
x = x + 2 # increased by 2
```
This is too trivial. Comments are expected to be explanatory. For comments, you can write them above, on or below a line of code, as long as you are consistent within the same piece of code.
We encourage you to put docstrings inside your functions but please pay attention to the indentation of docstrings. The following is a good example:
```python
def sum_ab(a, b):
"""
Return the sum of two integers a and b.
"""
return a + b
```
- Write tests (especially [__doctests__](https://docs.python.org/3/library/doctest.html)) to illustrate and verify your work. We highly encourage the use of _doctests on all functions_.
```python
def sum_ab(a, b):
"""
Return the sum of two integers a and b
>>> sum_ab(2, 2)
4
>>> sum_ab(-2, 3)
1
>>> sum_ab(4.9, 5.1)
10.0
"""
return a + b
```
These doctests will be run by pytest as part of our automated testing so please try to run your doctests locally and make sure that they are found and pass:
```bash
python3 -m doctest -v my_submission.py
```
The use of the Python builtin `input()` function is __not__ encouraged:
```python
input('Enter your input:')
# Or even worse...
input = eval(input("Enter your input: "))
```
However, if your code uses `input()` then we encourage you to gracefully deal with leading and trailing whitespace in user input by adding `.strip()` as in:
```python
starting_value = int(input("Please enter a starting value: ").strip())
```
The use of [Python type hints](https://docs.python.org/3/library/typing.html) is encouraged for function parameters and return values. Our automated testing will run [mypy](http://mypy-lang.org) so run that locally before making your submission.
```python
def sum_ab(a: int, b: int) -> int:
return a + b
```
Instructions on how to install mypy can be found [here](https://github.com/python/mypy). Please use the command `mypy --ignore-missing-imports .` to test all files or `mypy --ignore-missing-imports path/to/file.py` to test a specific file.
- [__List comprehensions and generators__](https://docs.python.org/3/tutorial/datastructures.html#list-comprehensions) are preferred over the use of `lambda`, `map`, `filter`, `reduce` but the important thing is to demonstrate the power of Python in code that is easy to read and maintain.
- Avoid importing external libraries for basic algorithms. Only use those libraries for complicated algorithms.
- If you need a third-party module that is not in the file __requirements.txt__, please add it to that file as part of your submission.
#### Other Requirements for Submissions
- If you are submitting code in the `project_euler/` directory, please also read [the dedicated Guideline](https://github.com/TheAlgorithms/Python/blob/master/project_euler/README.md) before contributing to our Project Euler library.
- The file extension for code files should be `.py`. Jupyter Notebooks should be submitted to [TheAlgorithms/Jupyter](https://github.com/TheAlgorithms/Jupyter).
- Strictly use snake_case (underscore_separated) in your file_name, as it will be easy to parse in future using scripts.
- Please avoid creating new directories if at all possible. Try to fit your work into the existing directory structure.
- If possible, follow the standard *within* the folder you are submitting to.
- If you have modified/added code work, make sure the code compiles before submitting.
- If you have modified/added documentation work, ensure your language is concise and contains no grammar errors.
- Do not update the README.md or DIRECTORY.md file which will be periodically autogenerated by our GitHub Actions processes.
- Add a corresponding explanation to [Algorithms-Explanation](https://github.com/TheAlgorithms/Algorithms-Explanation) (Optional but recommended).
- All submissions will be tested with [__mypy__](http://www.mypy-lang.org) so we encourage you to add [__Python type hints__](https://docs.python.org/3/library/typing.html) where it makes sense to do so.
- Most importantly,
- __Be consistent in the use of these guidelines when submitting.__
- __Join__ [Gitter](https://gitter.im/TheAlgorithms) __now!__
- Happy coding!
Writer [@poyea](https://github.com/poyea), Jun 2019.
| # Contributing guidelines
## Before contributing
Welcome to [TheAlgorithms/Python](https://github.com/TheAlgorithms/Python)! Before sending your pull requests, make sure that you __read the whole guidelines__. If you have any doubt on the contributing guide, please feel free to [state it clearly in an issue](https://github.com/TheAlgorithms/Python/issues/new) or ask the community in [Gitter](https://gitter.im/TheAlgorithms).
## Contributing
### Contributor
We are very happy that you consider implementing algorithms and data structures for others! This repository is referenced and used by learners from all over the globe. Being one of our contributors, you agree and confirm that:
- You did your work - no plagiarism allowed
- Any plagiarized work will not be merged.
- Your work will be distributed under [MIT License](LICENSE.md) once your pull request is merged
- Your submitted work fulfils or mostly fulfils our styles and standards
__New implementation__ is welcome! For example, new solutions for a problem, different representations for a graph data structure or algorithm designs with different complexity but __identical implementation__ of an existing implementation is not allowed. Please check whether the solution is already implemented or not before submitting your pull request.
__Improving comments__ and __writing proper tests__ are also highly welcome.
### Contribution
We appreciate any contribution, from fixing a grammar mistake in a comment to implementing complex algorithms. Please read this section if you are contributing your work.
Your contribution will be tested by our [automated testing on GitHub Actions](https://github.com/TheAlgorithms/Python/actions) to save time and mental energy. After you have submitted your pull request, you should see the GitHub Actions tests start to run at the bottom of your submission page. If those tests fail, then click on the ___details___ button and try to read through the GitHub Actions output to understand the failure. If you do not understand, please leave a comment on your submission page and a community member will try to help.
Please help us keep our issue list small by adding fixes: #{$ISSUE_NO} to the commit message of pull requests that resolve open issues. GitHub will use this tag to auto-close the issue when the PR is merged.
#### What is an Algorithm?
An Algorithm is one or more functions (or classes) that:
* take one or more inputs,
* perform some internal calculations or data manipulations,
* return one or more outputs,
* have minimal side effects (Ex. `print()`, `plot()`, `read()`, `write()`).
Algorithms should be packaged in a way that would make it easy for readers to put them into larger programs.
Algorithms should:
* have intuitive class and function names that make their purpose clear to readers
* use Python naming conventions and intuitive variable names to ease comprehension
* be flexible to take different input values
* have Python type hints for their input parameters and return values
* raise Python exceptions (`ValueError`, etc.) on erroneous input values
* have docstrings with clear explanations and/or URLs to source materials
* contain doctests that test both valid and erroneous input values
* return all calculation results instead of printing or plotting them
Algorithms in this repo should not be how-to examples for existing Python packages. Instead, they should perform internal calculations or manipulations to convert input values into different output values. Those calculations or manipulations can use data types, classes, or functions of existing Python packages but each algorithm in this repo should add unique value.
#### Pre-commit plugin
Use [pre-commit](https://pre-commit.com/#installation) to automatically format your code to match our coding style:
```bash
python3 -m pip install pre-commit # only required the first time
pre-commit install
```
That's it! The plugin will run every time you commit any changes. If there are any errors found during the run, fix them and commit those changes. You can even run the plugin manually on all files:
```bash
pre-commit run --all-files --show-diff-on-failure
```
#### Coding Style
We want your work to be readable by others; therefore, we encourage you to note the following:
- Please write in Python 3.9+. For instance: `print()` is a function in Python 3 so `print "Hello"` will *not* work but `print("Hello")` will.
- Please focus hard on the naming of functions, classes, and variables. Help your reader by using __descriptive names__ that can help you to remove redundant comments.
- Single letter variable names are *old school* so please avoid them unless their life only spans a few lines.
- Expand acronyms because `gcd()` is hard to understand but `greatest_common_divisor()` is not.
- Please follow the [Python Naming Conventions](https://pep8.org/#prescriptive-naming-conventions) so variable_names and function_names should be lower_case, CONSTANTS in UPPERCASE, ClassNames should be CamelCase, etc.
- We encourage the use of Python [f-strings](https://realpython.com/python-f-strings/#f-strings-a-new-and-improved-way-to-format-strings-in-python) where they make the code easier to read.
- Please consider running [__psf/black__](https://github.com/python/black) on your Python file(s) before submitting your pull request. This is not yet a requirement but it does make your code more readable and automatically aligns it with much of [PEP 8](https://www.python.org/dev/peps/pep-0008/). There are other code formatters (autopep8, yapf) but the __black__ formatter is now hosted by the Python Software Foundation. To use it,
```bash
python3 -m pip install black # only required the first time
black .
```
- All submissions will need to pass the test `flake8 . --ignore=E203,W503 --max-line-length=88` before they will be accepted so if possible, try this test locally on your Python file(s) before submitting your pull request.
```bash
python3 -m pip install flake8 # only required the first time
flake8 . --ignore=E203,W503 --max-line-length=88 --show-source
```
- Original code submissions require docstrings or comments to describe your work.
- More on docstrings and comments:
If you used a Wikipedia article or some other source material to create your algorithm, please add the URL in a docstring or comment to help your reader.
The following are considered to be bad and may be requested to be improved:
```python
x = x + 2 # increased by 2
```
This is too trivial. Comments are expected to be explanatory. For comments, you can write them above, on or below a line of code, as long as you are consistent within the same piece of code.
We encourage you to put docstrings inside your functions but please pay attention to the indentation of docstrings. The following is a good example:
```python
def sum_ab(a, b):
"""
Return the sum of two integers a and b.
"""
return a + b
```
- Write tests (especially [__doctests__](https://docs.python.org/3/library/doctest.html)) to illustrate and verify your work. We highly encourage the use of _doctests on all functions_.
```python
def sum_ab(a, b):
"""
Return the sum of two integers a and b
>>> sum_ab(2, 2)
4
>>> sum_ab(-2, 3)
1
>>> sum_ab(4.9, 5.1)
10.0
"""
return a + b
```
These doctests will be run by pytest as part of our automated testing so please try to run your doctests locally and make sure that they are found and pass:
```bash
python3 -m doctest -v my_submission.py
```
The use of the Python builtin `input()` function is __not__ encouraged:
```python
input('Enter your input:')
# Or even worse...
input = eval(input("Enter your input: "))
```
However, if your code uses `input()` then we encourage you to gracefully deal with leading and trailing whitespace in user input by adding `.strip()` as in:
```python
starting_value = int(input("Please enter a starting value: ").strip())
```
The use of [Python type hints](https://docs.python.org/3/library/typing.html) is encouraged for function parameters and return values. Our automated testing will run [mypy](http://mypy-lang.org) so run that locally before making your submission.
```python
def sum_ab(a: int, b: int) -> int:
return a + b
```
Instructions on how to install mypy can be found [here](https://github.com/python/mypy). Please use the command `mypy --ignore-missing-imports .` to test all files or `mypy --ignore-missing-imports path/to/file.py` to test a specific file.
- [__List comprehensions and generators__](https://docs.python.org/3/tutorial/datastructures.html#list-comprehensions) are preferred over the use of `lambda`, `map`, `filter`, `reduce` but the important thing is to demonstrate the power of Python in code that is easy to read and maintain.
- Avoid importing external libraries for basic algorithms. Only use those libraries for complicated algorithms.
- If you need a third-party module that is not in the file __requirements.txt__, please add it to that file as part of your submission.
#### Other Requirements for Submissions
- If you are submitting code in the `project_euler/` directory, please also read [the dedicated Guideline](https://github.com/TheAlgorithms/Python/blob/master/project_euler/README.md) before contributing to our Project Euler library.
- The file extension for code files should be `.py`. Jupyter Notebooks should be submitted to [TheAlgorithms/Jupyter](https://github.com/TheAlgorithms/Jupyter).
- Strictly use snake_case (underscore_separated) in your file_name, as it will be easy to parse in future using scripts.
- Please avoid creating new directories if at all possible. Try to fit your work into the existing directory structure.
- If possible, follow the standard *within* the folder you are submitting to.
- If you have modified/added code work, make sure the code compiles before submitting.
- If you have modified/added documentation work, ensure your language is concise and contains no grammar errors.
- Do not update the README.md or DIRECTORY.md file which will be periodically autogenerated by our GitHub Actions processes.
- Add a corresponding explanation to [Algorithms-Explanation](https://github.com/TheAlgorithms/Algorithms-Explanation) (Optional but recommended).
- All submissions will be tested with [__mypy__](http://www.mypy-lang.org) so we encourage you to add [__Python type hints__](https://docs.python.org/3/library/typing.html) where it makes sense to do so.
- Most importantly,
- __Be consistent in the use of these guidelines when submitting.__
- __Join__ [Gitter](https://gitter.im/TheAlgorithms) __now!__
- Happy coding!
Writer [@poyea](https://github.com/poyea), Jun 2019.
| -1 |
TheAlgorithms/Python | 6,236 | Upgrade GitHub Actions | ### Describe your change:
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
* [x] Upgrade automated testing
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [ ] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in their comments that points to Wikipedia or other similar explanations.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| cclauss | "2022-07-06T20:10:25Z" | "2022-07-07T03:25:25Z" | 9135a1f41192ebe1d835282a1465dc284359d95c | 0a0f4986e4fde05ebc2a24c9cc2cd6b8200b8df1 | Upgrade GitHub Actions. ### Describe your change:
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
* [x] Upgrade automated testing
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [ ] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in their comments that points to Wikipedia or other similar explanations.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| # Bit manipulation
Bit manipulation is the act of manipulating bits to detect errors (hamming code), encrypts and decrypts messages (more on that in the 'ciphers' folder) or just do anything at the lowest level of your computer.
* <https://en.wikipedia.org/wiki/Bit_manipulation>
* <https://docs.python.org/3/reference/expressions.html#binary-bitwise-operations>
* <https://docs.python.org/3/reference/expressions.html#unary-arithmetic-and-bitwise-operations>
* <https://docs.python.org/3/library/stdtypes.html#bitwise-operations-on-integer-types>
* <https://wiki.python.org/moin/BitManipulation>
* <https://wiki.python.org/moin/BitwiseOperators>
* <https://www.tutorialspoint.com/python3/bitwise_operators_example.htm>
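A small illustration of the kind of low-level operations described above (a sketch only; the function names are assumptions rather than code from this directory):
```python
def get_bit(number: int, position: int) -> int:
    """Return the bit (0 or 1) at the given position of number."""
    return (number >> position) & 1


def set_bit(number: int, position: int) -> int:
    """Return number with the bit at the given position set to 1."""
    return number | (1 << position)


print(get_bit(0b1010, 1))  # 1
print(bin(set_bit(0b1010, 0)))  # 0b1011
```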
| # Bit manipulation
Bit manipulation is the act of manipulating bits to detect errors (hamming code), encrypts and decrypts messages (more on that in the 'ciphers' folder) or just do anything at the lowest level of your computer.
* <https://en.wikipedia.org/wiki/Bit_manipulation>
* <https://docs.python.org/3/reference/expressions.html#binary-bitwise-operations>
* <https://docs.python.org/3/reference/expressions.html#unary-arithmetic-and-bitwise-operations>
* <https://docs.python.org/3/library/stdtypes.html#bitwise-operations-on-integer-types>
* <https://wiki.python.org/moin/BitManipulation>
* <https://wiki.python.org/moin/BitwiseOperators>
* <https://www.tutorialspoint.com/python3/bitwise_operators_example.htm>
| -1 |
TheAlgorithms/Python | 6,236 | Upgrade GitHub Actions | ### Describe your change:
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
* [x] Upgrade automated testing
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [ ] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in their comments that points to Wikipedia or other similar explanations.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| cclauss | "2022-07-06T20:10:25Z" | "2022-07-07T03:25:25Z" | 9135a1f41192ebe1d835282a1465dc284359d95c | 0a0f4986e4fde05ebc2a24c9cc2cd6b8200b8df1 | Upgrade GitHub Actions. ### Describe your change:
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
* [x] Upgrade automated testing
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [ ] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in their comments that points to Wikipedia or other similar explanations.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| # Normal Distribution QuickSort
QuickSort algorithm in which the pivot element is chosen at random between the first and last positions of the array, and the array elements are drawn from a standard normal distribution.
## Array elements
The array elements are taken from a Standard Normal Distribution, having mean = 0 and standard deviation = 1.
### The code
```python
>>> import numpy as np
>>> from tempfile import TemporaryFile
>>> outfile = TemporaryFile()
>>> p = 100 # 100 elements are to be sorted
>>> mu, sigma = 0, 1 # mean and standard deviation
>>> X = np.random.normal(mu, sigma, p)
>>> np.save(outfile, X)
>>> print('The array is')
>>> print(X)
```
------
#### The distribution of the array elements
```python
>>> import matplotlib.pyplot as plt
>>> mu, sigma = 0, 1  # mean and standard deviation
>>> s = np.random.normal(mu, sigma, p)
>>> count, bins, ignored = plt.hist(s, 30, density=True)  # `normed` was removed in newer matplotlib
>>> plt.plot(bins, 1 / (sigma * np.sqrt(2 * np.pi)) * np.exp(-(bins - mu) ** 2 / (2 * sigma ** 2)), linewidth=2, color='r')
>>> plt.show()
```
------

------
## Comparing the numbers of comparisons
We can plot the number of comparisons made by Normal Distribution QuickSort against ordinary QuickSort:
```python
>>> import matplotlib.pyplot as plt
# Normal Distribution QuickSort is red
>>> plt.plot([1,2,4,16,32,64,128,256,512,1024,2048],[1,1,6,15,43,136,340,800,2156,6821,16325],linewidth=2, color='r')
# Ordinary QuickSort is green
>>> plt.plot([1,2,4,16,32,64,128,256,512,1024,2048],[1,1,4,16,67,122,362,949,2131,5086,12866],linewidth=2, color='g')
>>> plt.show()
```
| # Normal Distribution QuickSort
QuickSort algorithm in which the pivot element is chosen at random between the first and last positions of the array, and the array elements are drawn from a standard normal distribution.
## Array elements
The array elements are taken from a Standard Normal Distribution, having mean = 0 and standard deviation = 1.
### The code
```python
>>> import numpy as np
>>> from tempfile import TemporaryFile
>>> outfile = TemporaryFile()
>>> p = 100 # 100 elements are to be sorted
>>> mu, sigma = 0, 1 # mean and standard deviation
>>> X = np.random.normal(mu, sigma, p)
>>> np.save(outfile, X)
>>> print('The array is')
>>> print(X)
```
------
#### The distribution of the array elements
```python
>>> import matplotlib.pyplot as plt
>>> mu, sigma = 0, 1  # mean and standard deviation
>>> s = np.random.normal(mu, sigma, p)
>>> count, bins, ignored = plt.hist(s, 30, density=True)  # `normed` was removed in newer matplotlib
>>> plt.plot(bins, 1 / (sigma * np.sqrt(2 * np.pi)) * np.exp(-(bins - mu) ** 2 / (2 * sigma ** 2)), linewidth=2, color='r')
>>> plt.show()
```
------

------
## Comparing the numbers of comparisons
We can plot the number of comparisons made by Normal Distribution QuickSort against ordinary QuickSort:
```python
>>> import matplotlib.pyplot as plt
# Normal Distribution QuickSort is red
>>> plt.plot([1,2,4,16,32,64,128,256,512,1024,2048],[1,1,6,15,43,136,340,800,2156,6821,16325],linewidth=2, color='r')
# Ordinary QuickSort is green
>>> plt.plot([1,2,4,16,32,64,128,256,512,1024,2048],[1,1,4,16,67,122,362,949,2131,5086,12866],linewidth=2, color='g')
>>> plt.show()
```
| -1 |
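The Normal Distribution QuickSort README above shows how the input is generated and plotted but not the sort itself; a minimal sketch of a QuickSort whose pivot index is drawn uniformly between the first and last positions is given below. The function name `random_pivot_quicksort` is illustrative, not the repository's implementation:
```python
import random

import numpy as np


def random_pivot_quicksort(arr: list[float]) -> list[float]:
    """Return a sorted copy of arr, partitioning around a randomly chosen pivot."""
    if len(arr) <= 1:
        return list(arr)
    pivot = arr[random.randint(0, len(arr) - 1)]  # pivot index drawn uniformly
    smaller = [x for x in arr if x < pivot]
    equal = [x for x in arr if x == pivot]
    larger = [x for x in arr if x > pivot]
    return random_pivot_quicksort(smaller) + equal + random_pivot_quicksort(larger)


if __name__ == "__main__":
    data = np.random.normal(0, 1, 100).tolist()  # standard normal samples
    assert random_pivot_quicksort(data) == sorted(data)
    print("sorted 100 standard-normal samples")
```
Choosing the pivot uniformly at random keeps the expected number of comparisons at O(n log n) regardless of the input order.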
TheAlgorithms/Python | 6,236 | Upgrade GitHub Actions | ### Describe your change:
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
* [x] Upgrade automated testing
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [ ] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in their comments that points to Wikipedia or other similar explanations.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| cclauss | "2022-07-06T20:10:25Z" | "2022-07-07T03:25:25Z" | 9135a1f41192ebe1d835282a1465dc284359d95c | 0a0f4986e4fde05ebc2a24c9cc2cd6b8200b8df1 | Upgrade GitHub Actions. ### Describe your change:
* [ ] Add an algorithm?
* [ ] Fix a bug or typo in an existing algorithm?
* [ ] Documentation change?
* [x] Upgrade automated testing
### Checklist:
* [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md).
* [x] This pull request is all my own work -- I have not plagiarized.
* [ ] I know that pull requests will not be merged if they fail the automated tests.
* [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
* [ ] All new Python files are placed inside an existing directory.
* [ ] All filenames are in all lowercase characters with no spaces or dashes.
* [ ] All functions and variable names follow Python naming conventions.
* [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html).
* [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing.
* [ ] All new algorithms have a URL in their comments that points to Wikipedia or other similar explanations.
* [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
| # Project Euler
Problems are taken from the Project Euler website, https://projecteuler.net/. [Problems are licensed under CC BY-NC-SA 4.0](https://projecteuler.net/copyright).
Project Euler is a series of challenging mathematical/computer programming problems that require more than just mathematical
insights to solve. Project Euler is ideal for mathematicians who are learning to code.
The solutions will be checked by our [automated testing on GitHub Actions](https://github.com/TheAlgorithms/Python/actions) with the help of [this script](https://github.com/TheAlgorithms/Python/blob/master/scripts/validate_solutions.py). The efficiency of your code is also checked. You can view the top 10 slowest solutions on GitHub Actions logs (under `slowest 10 durations`) and open a pull request to improve those solutions.
## Solution Guidelines
Welcome to [TheAlgorithms/Python](https://github.com/TheAlgorithms/Python)! Before reading the solution guidelines, make sure you read the whole [Contributing Guidelines](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md), as they won't be repeated here. If you have any doubts about the guidelines, please feel free to [state them clearly in an issue](https://github.com/TheAlgorithms/Python/issues/new) or ask the community on [Gitter](https://gitter.im/TheAlgorithms). You can use the [template](https://github.com/TheAlgorithms/Python/blob/master/project_euler/README.md#solution-template) we have provided below as your starting point, but be sure to read the [Coding Style](https://github.com/TheAlgorithms/Python/blob/master/project_euler/README.md#coding-style) part first.
### Coding Style
* Please maintain consistency in project directory and solution file names. Keep the following points in mind:
* Create a new directory only for the problems which do not exist yet.
* If you create a new directory, please create an empty `__init__.py` file inside it as well.
* Please name the project **directory** as `problem_<problem_number>` where `problem_number` should be filled with 0s so as to occupy 3 digits. Example: `problem_001`, `problem_002`, `problem_067`, `problem_145`, and so on.
* Please provide a link to the problem and other references, if used, in the **module-level docstring**.
* All imports should come ***after*** the module-level docstring.
* You can have as many helper functions as you want but there should be one main function called `solution` which should satisfy the conditions as stated below:
* It should contain positional argument(s) whose default value is the question input. Example: Please take a look at [Problem 1](https://projecteuler.net/problem=1) where the question is to *Find the sum of all the multiples of 3 or 5 below 1000.* In this case the main solution function will be `solution(limit: int = 1000)`.
* When the `solution` function is called without any arguments like so: `solution()`, it should return the answer to the problem.
* Every function, which includes all the helper functions, if any, and the main solution function, should have `doctest` in the function docstring along with a brief statement mentioning what the function is about.
* There should not be a `doctest` for testing the answer as that is done by our GitHub Actions build using this [script](https://github.com/TheAlgorithms/Python/blob/master/scripts/validate_solutions.py). Keeping in mind the above example of [Problem 1](https://projecteuler.net/problem=1):
```python
def solution(limit: int = 1000):
"""
A brief statement mentioning what the function is about.
You can have a detailed explanation about the solution method in the
module-level docstring.
>>> solution(1)
...
>>> solution(16)
...
>>> solution(100)
...
"""
```
### Solution Template
You can use the below template as your starting point but please read the [Coding Style](https://github.com/TheAlgorithms/Python/blob/master/project_euler/README.md#coding-style) first to understand how the template works.
Please change the names of the helper functions accordingly, replace the parameter names with descriptive ones, and replace the content within `[square brackets]` (including the brackets) with the appropriate content.
```python
"""
Project Euler Problem [problem number]: [link to the original problem]
... [Entire problem statement] ...
... [Solution explanation - Optional] ...
References [Optional]:
- [Wikipedia link to the topic]
- [Stackoverflow link]
...
"""
import module1
import module2
...
def helper1(arg1: [type hint], arg2: [type hint], ...) -> [Return type hint]:
"""
A brief statement explaining what the function is about.
... A more elaborate description ... [Optional]
...
[Doctest]
...
"""
...
# calculations
...
return
# You can have multiple helper functions but the solution function should be
# after all the helper functions ...
def solution(arg1: [type hint], arg2: [type hint], ...) -> [Return type hint]:
"""
A brief statement mentioning what the function is about.
You can have a detailed explanation about the solution in the
module-level docstring.
...
[Doctest as mentioned above]
...
"""
...
# calculations
...
return answer
if __name__ == "__main__":
print(f"{solution() = }")
```
| # Project Euler
Problems are taken from the Project Euler website, https://projecteuler.net/. [Problems are licensed under CC BY-NC-SA 4.0](https://projecteuler.net/copyright).
Project Euler is a series of challenging mathematical/computer programming problems that require more than just mathematical
insights to solve. Project Euler is ideal for mathematicians who are learning to code.
The solutions will be checked by our [automated testing on GitHub Actions](https://github.com/TheAlgorithms/Python/actions) with the help of [this script](https://github.com/TheAlgorithms/Python/blob/master/scripts/validate_solutions.py). The efficiency of your code is also checked. You can view the top 10 slowest solutions on GitHub Actions logs (under `slowest 10 durations`) and open a pull request to improve those solutions.
## Solution Guidelines
Welcome to [TheAlgorithms/Python](https://github.com/TheAlgorithms/Python)! Before reading the solution guidelines, make sure you read the whole [Contributing Guidelines](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md), as they won't be repeated here. If you have any doubts about the guidelines, please feel free to [state them clearly in an issue](https://github.com/TheAlgorithms/Python/issues/new) or ask the community on [Gitter](https://gitter.im/TheAlgorithms). You can use the [template](https://github.com/TheAlgorithms/Python/blob/master/project_euler/README.md#solution-template) we have provided below as your starting point, but be sure to read the [Coding Style](https://github.com/TheAlgorithms/Python/blob/master/project_euler/README.md#coding-style) part first.
### Coding Style
* Please maintain consistency in project directory and solution file names. Keep the following points in mind:
* Create a new directory only for the problems which do not exist yet.
* If you create a new directory, please create an empty `__init__.py` file inside it as well.
* Please name the project **directory** as `problem_<problem_number>` where `problem_number` should be filled with 0s so as to occupy 3 digits. Example: `problem_001`, `problem_002`, `problem_067`, `problem_145`, and so on.
* Please provide a link to the problem and other references, if used, in the **module-level docstring**.
* All imports should come ***after*** the module-level docstring.
* You can have as many helper functions as you want but there should be one main function called `solution` which should satisfy the conditions as stated below:
* It should contain positional argument(s) whose default value is the question input. Example: Please take a look at [Problem 1](https://projecteuler.net/problem=1) where the question is to *Find the sum of all the multiples of 3 or 5 below 1000.* In this case the main solution function will be `solution(limit: int = 1000)`.
* When the `solution` function is called without any arguments like so: `solution()`, it should return the answer to the problem.
* Every function, which includes all the helper functions, if any, and the main solution function, should have `doctest` in the function docstring along with a brief statement mentioning what the function is about.
* There should not be a `doctest` for testing the answer as that is done by our GitHub Actions build using this [script](https://github.com/TheAlgorithms/Python/blob/master/scripts/validate_solutions.py). Keeping in mind the above example of [Problem 1](https://projecteuler.net/problem=1):
```python
def solution(limit: int = 1000):
"""
A brief statement mentioning what the function is about.
You can have a detailed explanation about the solution method in the
module-level docstring.
>>> solution(1)
...
>>> solution(16)
...
>>> solution(100)
...
"""
```
### Solution Template
You can use the below template as your starting point but please read the [Coding Style](https://github.com/TheAlgorithms/Python/blob/master/project_euler/README.md#coding-style) first to understand how the template works.
Please change the names of the helper functions accordingly, replace the parameter names with descriptive ones, and replace the content within `[square brackets]` (including the brackets) with the appropriate content.
```python
"""
Project Euler Problem [problem number]: [link to the original problem]
... [Entire problem statement] ...
... [Solution explanation - Optional] ...
References [Optional]:
- [Wikipedia link to the topic]
- [Stackoverflow link]
...
"""
import module1
import module2
...
def helper1(arg1: [type hint], arg2: [type hint], ...) -> [Return type hint]:
"""
A brief statement explaining what the function is about.
... A more elaborate description ... [Optional]
...
[Doctest]
...
"""
...
# calculations
...
return
# You can have multiple helper functions but the solution function should be
# after all the helper functions ...
def solution(arg1: [type hint], arg2: [type hint], ...) -> [Return type hint]:
"""
A brief statement mentioning what the function is about.
You can have a detailed explanation about the solution in the
module-level docstring.
...
[Doctest as mentioned above]
...
"""
...
# calculations
...
return answer
if __name__ == "__main__":
print(f"{solution() = }")
```
| -1 |
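To make the template above concrete, a hypothetical minimal solution file for [Problem 1](https://projecteuler.net/problem=1) written to these conventions might look like the sketch below. The doctest values for small limits are illustrative, and the repository's actual Problem 1 solutions may differ:
```python
"""
Project Euler Problem 1: https://projecteuler.net/problem=1

Find the sum of all the multiples of 3 or 5 below 1000.
"""


def solution(limit: int = 1000) -> int:
    """
    Return the sum of all multiples of 3 or 5 below limit.

    >>> solution(10)
    23
    >>> solution(4)
    3
    >>> solution(1)
    0
    """
    return sum(n for n in range(limit) if n % 3 == 0 or n % 5 == 0)


if __name__ == "__main__":
    print(f"{solution() = }")
```
Note that the doctests only cover small inputs; the actual answer for `solution()` is verified by the validation script run in CI, as required by the guidelines.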