

@danimal141
Created October 12, 2016 01:05
A Tour of Go Exercise: Web Crawler
package main

import (
	"fmt"
	"sync"
)

type Fetcher interface {
	// Fetch returns the body of URL and
	// a slice of URLs found on that page.
	Fetch(url string) (body string, urls []string, err error)
}

// Crawl uses fetcher to recursively crawl
// pages starting with url, to a maximum of depth.
func Crawl(url string, depth int, fetcher Fetcher) {
	fetched := make(map[string]bool)
	crawl(url, depth, fetcher, fetched)
}

// crawl does the actual work, recording visited URLs in fetched and
// spawning a goroutine per link found on the page.
func crawl(url string, depth int, fetcher Fetcher, fetched map[string]bool) {
	if depth <= 0 {
		return
	}
	if v, ok := fetched[url]; ok && v {
		return
	}
	body, urls, err := fetcher.Fetch(url)
	if err != nil {
		fmt.Println(err)
		return
	}
	fmt.Printf("found: %s %q\n", url, body)
	fetched[url] = true
	var wg sync.WaitGroup
	for _, u := range urls {
		wg.Add(1)
		go func(u string) {
			defer wg.Done()
			crawl(u, depth-1, fetcher, fetched)
		}(u)
	}
	wg.Wait()
}

func main() {
	Crawl("http://golang.org/", 4, fetcher)
}

// fakeFetcher is a Fetcher that returns canned results.
type fakeFetcher map[string]*fakeResult

type fakeResult struct {
	body string
	urls []string
}

func (f fakeFetcher) Fetch(url string) (string, []string, error) {
	if res, ok := f[url]; ok {
		return res.body, res.urls, nil
	}
	return "", nil, fmt.Errorf("not found: %s", url)
}

// fetcher is a populated fakeFetcher.
var fetcher = fakeFetcher{
	"http://golang.org/": &fakeResult{
		"The Go Programming Language",
		[]string{
			"http://golang.org/pkg/",
			"http://golang.org/cmd/",
		},
	},
	"http://golang.org/pkg/": &fakeResult{
		"Packages",
		[]string{
			"http://golang.org/",
			"http://golang.org/cmd/",
			"http://golang.org/pkg/fmt/",
			"http://golang.org/pkg/os/",
		},
	},
	"http://golang.org/pkg/fmt/": &fakeResult{
		"Package fmt",
		[]string{
			"http://golang.org/",
			"http://golang.org/pkg/",
		},
	},
	"http://golang.org/pkg/os/": &fakeResult{
		"Package os",
		[]string{
			"http://golang.org/",
			"http://golang.org/pkg/",
		},
	},
}
@yakkeita

yakkeita commented Apr 2, 2022

Hi,
I have a question about the code:
What guarantees that the map fetched won't be accessed concurrently and get corrupted?

Also, the variable var wg sync.WaitGroup is redeclared every time crawl is called. That won't cause a problem by itself, but perhaps your intent was to use a single wg: as its name suggests, a WaitGroup is for waiting on a group of related goroutines to finish, not for creating a new WaitGroup per call.
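
One way to address both points is to guard the shared map with a sync.Mutex and to create a single WaitGroup in Crawl that every goroutine reports to. The sketch below is only an illustration, not part of the original gist: the safeFetched type and its seen method are names made up here, and this Crawl/crawl pair would stand in for the one above while reusing the same Fetcher interface and fake fetcher.

package main

import (
	"fmt"
	"sync"
)

// safeFetched guards the visited-URL set with a mutex so concurrent
// goroutines cannot read and write the map at the same time.
type safeFetched struct {
	mu   sync.Mutex
	urls map[string]bool
}

// seen reports whether url has already been crawled and, if not, records it.
func (s *safeFetched) seen(url string) bool {
	s.mu.Lock()
	defer s.mu.Unlock()
	if s.urls[url] {
		return true
	}
	s.urls[url] = true
	return false
}

// Crawl creates one shared map and one shared WaitGroup for the whole crawl.
func Crawl(url string, depth int, fetcher Fetcher) {
	fetched := &safeFetched{urls: make(map[string]bool)}
	var wg sync.WaitGroup
	wg.Add(1)
	go crawl(url, depth, fetcher, fetched, &wg)
	wg.Wait()
}

// crawl marks url as seen under the mutex, fetches it, and spawns a
// goroutine per link, all counted by the single shared WaitGroup.
func crawl(url string, depth int, fetcher Fetcher, fetched *safeFetched, wg *sync.WaitGroup) {
	defer wg.Done()
	if depth <= 0 || fetched.seen(url) {
		return
	}
	body, urls, err := fetcher.Fetch(url)
	if err != nil {
		fmt.Println(err)
		return
	}
	fmt.Printf("found: %s %q\n", url, body)
	for _, u := range urls {
		wg.Add(1)
		go crawl(u, depth-1, fetcher, fetched, wg)
	}
}

A channel-based solution, where a single goroutine owns the map and receives results from the fetching goroutines, is another common way to solve this exercise; the mutex version above is simply the smaller change to the existing code.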
