Commit 21c93bf: "day 16"
1 parent f3914a5

3 files changed: +90 -1 lines changed

README.md (+1 -1: 1 addition, 1 deletion)

```diff
@@ -17,7 +17,7 @@
 |[13](https://adventofcode.com/2020/day/13)|Shuttle Search|[py](/day13/main.py)|
 |[14](https://adventofcode.com/2020/day/14)|Docking Data|[py](/day14/main.py)|
 |[15](https://adventofcode.com/2020/day/15)|Rambunctious Recitation|[py](/day15/main.py), [alt](/day15/alt.py)|
-|[16](https://adventofcode.com/2020/day/16)|-|-|
+|[16](https://adventofcode.com/2020/day/16)|Ticket Translation|[py](/day16/main.py), [alt](/day16/alt.py)|
 |[17](https://adventofcode.com/2020/day/17)|-|-|
 |[18](https://adventofcode.com/2020/day/18)|-|-|
 |[19](https://adventofcode.com/2020/day/19)|-|-|
```

day16/alt.py (+29: 29 additions, 0 deletions)

```python
import re

import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import maximum_bipartite_matching

with open('input.txt') as f:
    ls = [line.strip() for line in f]

# The input layout is hardcoded: 20 field rules, your ticket on line 22,
# nearby tickets from line 25 onward.
ranges = [list(map(int, re.findall(r'\d+', x))) for x in ls[:20]]
your = np.array([int(x) for x in ls[22].split(',')], dtype=np.int64)
nearby = [list(map(int, re.findall(r'\d+', x))) for x in ls[25:]]

# Part one: a value is valid if it falls inside either range of any rule.
valid = set()
for t1, t2, t3, t4 in ranges:
    valid |= set(range(t1, t2 + 1))
    valid |= set(range(t3, t4 + 1))

print(sum(n for l in nearby for n in l if n not in valid))

# Part two: keep only fully valid tickets, build a position-vs-rule
# compatibility matrix, and solve the assignment as a bipartite matching.
valids = [l for l in nearby if all(n in valid for n in l)]
loc = [[all((t1 <= l[j] <= t2) or (t3 <= l[j] <= t4) for l in valids)
        for t1, t2, t3, t4 in ranges]
       for j in range(20)]
m = maximum_bipartite_matching(csr_matrix(loc))
# m maps each rule to its matched ticket position; the first six rules
# in the input are the "departure" fields.
print(your[m[:6]].prod())
```
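
The matching call is the clever bit: scipy's Hopcroft-Karp implementation assigns every rule to a ticket position in one shot. A minimal sketch of its row/column semantics, on a hypothetical 3x3 matrix that is not from the puzzle input:

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import maximum_bipartite_matching

# Toy compatibility matrix (illustrative data): rows are ticket positions,
# columns are rules; entry (i, j) is 1 when position i could satisfy rule j.
loc = np.array([
    [1, 1, 1],  # position 0 fits every rule
    [0, 1, 1],  # position 1 fits rules 1 and 2
    [0, 0, 1],  # position 2 fits only rule 2
])

# With the default perm_type='row', the result is indexed by column:
# m[j] is the row (position) matched to rule j, or -1 if unmatched.
m = maximum_bipartite_matching(csr_matrix(loc))
print(m)  # [0 1 2]: the only perfect matching pairs each rule j with position j
```

Indexing `your` with `m[:6]` therefore picks out the positions matched to the first six rules, which in this puzzle are the "departure" fields.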

day16/main.py (+60: 60 additions, 0 deletions)

```python
import numpy as np

with open("input.txt") as f:
    lines = [x.strip() for x in f]

sep1 = "your ticket:"
sep2 = "nearby tickets:"

# Split the input into the rule block, your ticket, and the nearby tickets.
part1 = lines[:lines.index(sep1) - 1]
part2 = [int(n) for n in lines[lines.index(sep1) + 1].split(",")]
part3 = lines[lines.index(sep2) + 1:]

# alloweds[j] is the set of all values permitted by rule j.
alloweds = []
for line in part1:
    a = set()
    ranges = line.split(": ")[1]
    for r in ranges.split(" or "):
        bounds = r.split("-")
        lb, ub = int(bounds[0]), int(bounds[1])
        for i in range(lb, ub + 1):
            a.add(i)
    alloweds.append(a)

score = []
c = len(part1)
# poss_mat[i][j] == 1 while ticket position i could still be rule j.
poss_mat = np.ones((c, c), dtype=np.uint8)

for line in part3:
    okay = True
    numbers = [int(n) for n in line.split(",")]
    for n in numbers:
        if not any(n in a for a in alloweds):
            score.append(n)  # part one: values valid for no rule at all
            okay = False
    if okay:
        # Fully valid tickets narrow down which rule each position can be.
        for field_idx, n in enumerate(numbers):
            for j, a in enumerate(alloweds):
                if n not in a:
                    poss_mat[field_idx][j] = 0

print(sum(score))

# Repeatedly lock in any position with exactly one candidate rule and
# eliminate that rule everywhere else, until nothing changes.
change = True
seen = set()
while change:
    change = False
    for i, row in enumerate(poss_mat):
        if sum(row) == 1 and i not in seen:
            seen.add(i)
            change = True
            col = np.argmax(row)
            poss_mat[:, col] = 0
            poss_mat[i, col] = 1

# The six "departure" rules are the first six in the input.
res = 1
for j, row in enumerate(poss_mat):
    if np.argmax(row) < 6:
        res *= part2[j]

print(res)
```
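
The while loop is a standard elimination pass: any position with exactly one surviving candidate claims that rule, which is then removed from every other position. A minimal sketch on a hypothetical 3x3 matrix (not from the commit), which converges to a permutation matrix when the assignment is unique:

```python
import numpy as np

# Hypothetical possibility matrix: rows are positions, columns are rules.
poss = np.array([[1, 1, 1],
                 [0, 1, 1],
                 [0, 1, 0]], dtype=np.uint8)

seen, change = set(), True
while change:
    change = False
    for i, row in enumerate(poss):
        if row.sum() == 1 and i not in seen:
            seen.add(i)
            change = True
            col = np.argmax(row)
            poss[:, col] = 0  # no other position may claim this rule
            poss[i, col] = 1

print(poss)
# [[1 0 0]
#  [0 0 1]
#  [0 1 0]]
```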

0 commit comments