Everybody thinks they understand depth-first search. The algorithm is indeed simple. But — I think — it is deceptively simple. In this post I play with some code.
Given a graph like
```python
g = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2]}
```
you are most likely to see DFS implemented as some variation of the following:
```python
def dfs_a(g, root):
    seen = set()
    def rec(x):
        seen.add(x)
        for y in g[x]:
            if y not in seen:
                rec(y)
    rec(root)
```
This is not bad, but it has two issues. One is that it hides many of the cool features of DFS. The other is that Python is crap at recursion.
Let's first expand the code to expose a few properties of the algorithm.
```python
def dfs_b(g):
    seen, done = {}, {}
    time = 0

    def previsit(x):
        nonlocal time
        time += 1
        seen[x] = time
        print('previsit', x, 'at time', time)

    def postvisit(x):
        nonlocal time
        time += 1
        done[x] = time
        print('postvisit', x, 'at time', time)

    def rec(y):
        for z in g[y]:
            if z in done:
                print('cross arc from', y, 'to', z)
                continue
            if z in seen:
                print('back arc from', y, 'to', z)
                continue
            previsit(z)
            rec(z)
            postvisit(z)

    for x in g.keys():
        if x in done:
            print('already visited', x)
            continue
        previsit(x)
        rec(x)
        postvisit(x)
```
A few things to notice:
- Sometimes you'll want to iterate over all vertices, instead of using one root.
- There is a time when you first get to a vertex, and a time when you leave it. These times are well-bracketed: if the times for vertex $x$ are $(a_x,b_x)$ and those for vertex $y$ are $(a_y,b_y)$, then the two intervals are either disjoint, or one includes the other.
- The post-order of the vertices is what you get if you grep for postvisit. The pre-order of the vertices is what you get if you grep for previsit. The post-order is useful, for example, in Kosaraju's algorithm to find strongly connected components.
- There is a difference between back arcs and cross arcs! To distinguish them, you need two bits per vertex. In this implementation I used the dictionaries seen and done. If you know you don't have cycles, you can get rid of seen. (You might know this, say, if you look at the graph of GIT commits.) If you know you don't have cross arcs, you can get rid of done. (Note: A graph is reducible iff its set of back arcs doesn't depend on the order of the for loops above.)
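The bracketing property can be checked mechanically. Here is a small self-contained sketch of my own (a stripped-down recursive DFS, not the code above) that records the $(a_x,b_x)$ pairs and asserts that any two intervals are nested or disjoint:

```python
def dfs_times(g):
    # Record (previsit, postvisit) times for each vertex of g.
    seen, done = {}, {}
    time = 0

    def rec(x):
        nonlocal time
        time += 1
        seen[x] = time
        for y in g[x]:
            if y not in seen:
                rec(y)
        time += 1
        done[x] = time

    for x in g:
        if x not in seen:
            rec(x)
    return {x: (seen[x], done[x]) for x in g}

g = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2]}
intervals = list(dfs_times(g).values())
for (a, b) in intervals:
    for (c, d) in intervals:
        # Either disjoint, or one interval contains the other.
        assert b < c or d < a or (a <= c and d <= b) or (c <= a and b <= d)
```

On the example graph the intervals come out fully nested: the interval of the root contains all the others.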
What about complexity? If the graph has $m$ arcs and $n$ vertices, then the time is $\Theta(m+n)$, since each vertex is pre-visited and post-visited once and each arc is scanned once, and the space is $\Theta(n)$.
Now on to the second problem. Run this code:
```python
n = 0
try:
    def go():
        global n
        n += 1
        go()
    go()
except RecursionError:
    print('gave up at', n)
```
On my computer it says gave up at 999. Even puny graphs have more than 999 vertices!
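One workaround, before rewriting anything, is to raise the limit with `sys.setrecursionlimit`. This is only a stopgap (and the function name below, `count_reachable`, is my own): the limit exists to protect the C stack, and setting it very high just trades a clean `RecursionError` for a possible interpreter crash.

```python
import sys

sys.setrecursionlimit(100_000)  # enough headroom for a depth-10000 recursion

def count_reachable(g, x, seen=None):
    # Recursive DFS counting the vertices reachable from x.
    if seen is None:
        seen = set()
    seen.add(x)
    for y in g[x]:
        if y not in seen:
            count_reachable(g, y, seen)
    return len(seen)

# A path 0 -> 1 -> ... -> 9999 overflows the default limit of ~1000.
path = {i: [i + 1] for i in range(9999)}
path[9999] = []
print(count_reachable(path, 0))  # reaches all 10000 vertices
```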
But we can make the call stack explicit.
For this, we need to notice what state rec keeps. Very little: the current vertex and the state of the iterator.
The transformation into the code below is almost automatic.
```python
def dfs_c(g):
    seen, done = {}, {}
    time = 0

    def previsit(x):
        nonlocal time
        time += 1
        seen[x] = time
        print('previsit', x, 'at time', time)

    def postvisit(x):
        nonlocal time
        time += 1
        done[x] = time
        print('postvisit', x, 'at time', time)

    for x, ys in g.items():
        if x in done:
            print('already visited', x)
            continue
        previsit(x)
        stack = [(x, iter(ys))]
        while stack:
            y, i = stack.pop()
            try:
                z = next(i)
                stack.append((y, i))
                if z in done:
                    print('cross arc from', y, 'to', z)
                    continue
                if z in seen:
                    print('back arc from', y, 'to', z)
                    continue
                previsit(z)
                stack.append((z, iter(g[z])))
            except StopIteration:
                postvisit(y)
```
I must admit: this looks a bit scary. One reason is that it keeps explicit all the distinctions that could be of use. For example, the post-visiting is useful for computing SCCs, but sometimes it is not needed. So it's now time to strip all of this and go back to the first piece of code. The only modification I'll keep is the change from recursion to a loop.
```python
def dfs_d(g, root):
    seen = set([root])
    stack = [(root, iter(g[root]))]
    while stack:
        try:
            x, i = stack.pop()
            y = next(i)
            stack.append((x, i))
            if y not in seen:
                seen.add(y)
                stack.append((y, iter(g[y])))
        except StopIteration:
            pass
```
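As a quick sanity check, here is the same loop once more, this time returning the visited set so we can look at it; the extra vertex 4 is my own addition, to show that unreachable vertices are skipped:

```python
def dfs_d(g, root):
    # Iterative DFS from root, returning the set of reachable vertices.
    seen = set([root])
    stack = [(root, iter(g[root]))]
    while stack:
        try:
            x, i = stack.pop()
            y = next(i)
            stack.append((x, i))
            if y not in seen:
                seen.add(y)
                stack.append((y, iter(g[y])))
        except StopIteration:
            pass
    return seen

g = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2], 4: [4]}
print(dfs_d(g, 0))  # vertex 4 is not reachable from 0
```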
This is almost simple. But, hey, if I google for ‘depth-first search python’, then I get this code by Edward Mann.
```python
def dfs(graph, start):
    # Note: the adjacency lists must be sets here,
    # so that the set difference below works.
    visited, stack = set(), [start]
    while stack:
        vertex = stack.pop()
        if vertex not in visited:
            visited.add(vertex)
            stack.extend(graph[vertex] - visited)
    return visited
```
This code is even simpler! Yes, it is, and you should use it: compared to the other top results from Google that I saw, this one is clearly the best. It has only one problem: the memory use is $\Theta(m)$ rather than $\Theta(n)$, because unvisited neighbors pile up on the stack. This means that for dense graphs you might want to write a bit of extra code to deal with iterators.
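To see the $\Theta(m)$ stack in action, here is a rough instrumented harness of my own (not from any of the posts above): on a complete graph with $n$ vertices, the list-based stack peaks at roughly $n^2/2$ entries, while the iterator-based one never exceeds $n$.

```python
def dfs_list(graph, start):
    # Mann-style DFS; returns the peak size of the stack.
    visited, stack, peak = set(), [start], 1
    while stack:
        peak = max(peak, len(stack))
        vertex = stack.pop()
        if vertex not in visited:
            visited.add(vertex)
            stack.extend(graph[vertex] - visited)
    return peak

def dfs_iter(graph, start):
    # Iterator-based DFS; returns the peak size of the stack.
    seen, stack, peak = {start}, [(start, iter(graph[start]))], 1
    while stack:
        peak = max(peak, len(stack))
        x, i = stack.pop()
        try:
            y = next(i)
            stack.append((x, i))
            if y not in seen:
                seen.add(y)
                stack.append((y, iter(graph[y])))
        except StopIteration:
            pass
    return peak

n = 50
complete = {x: set(range(n)) - {x} for x in range(n)}
print(dfs_list(complete, 0), dfs_iter(complete, 0))
```

The iterator version keeps at most one stack entry per visited vertex, which is where the $\Theta(n)$ bound comes from.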
What's the point of the other variants? The point is that they are more general. By studying them, you should get a better feeling of what DFS does. For example, you might be able to decode this:
Algorithm for finding strongly connected components:
- using DFS on the original graph, sort vertices in reverse post-order
- using DFS on the reversed graph, find components; for the outer loop of this DFS use the order built at step 1
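The two-pass recipe above can be sketched in a few lines. This is my own compact version (recursive for brevity, so it assumes graphs small enough for the recursion limit), building the reversed graph explicitly:

```python
def kosaraju(g):
    # Pass 1: compute a post-order of the original graph.
    seen, order = set(), []

    def rec(x):
        seen.add(x)
        for y in g[x]:
            if y not in seen:
                rec(y)
        order.append(x)

    for x in g:
        if x not in seen:
            rec(x)

    # Build the reversed graph.
    rg = {x: [] for x in g}
    for x in g:
        for y in g[x]:
            rg[y].append(x)

    # Pass 2: DFS on the reversed graph, taking roots in
    # reverse post-order; each tree is one SCC.
    comp = {}

    def rec2(x, c):
        comp[x] = c
        for y in rg[x]:
            if y not in comp:
                rec2(y, c)

    for x in reversed(order):
        if x not in comp:
            rec2(x, x)
    return comp  # maps each vertex to a representative of its SCC

g = {0: [1], 1: [2], 2: [0, 3], 3: [4], 4: [3]}
print(kosaraju(g))  # {0, 1, 2} form one SCC, {3, 4} another
```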
1 comment:
Really a great puzzle, different from many common ones, with an interesting problem. Can I ask why you used DFS instead of BFS? I think most people prefer DFS over BFS. But is using DFS due to "The recursive method of the Depth-First Search algorithm is implemented using stack." (source: DFS in Python) or because "DFS will take less memory than BFS" (source: DFS vs BFS)?