Go Web Scraping Quick Start Guide
Barnes and Noble
Current price: $26.99
Size: Paperback
Learn how Go-specific language features simplify building web scrapers, along with common pitfalls and best practices for web scraping.
Use Go libraries like Goquery and Colly to scrape the web
Common pitfalls and best practices to effectively scrape and crawl
Learn how to scrape using the Go concurrency model
Web scraping is the process of extracting information from the web using tools that scrape and crawl. Go is emerging as a language of choice for scraping thanks to a variety of libraries. This book will quickly show you how to scrape data from various websites using Go libraries such as Colly and Goquery.
The book starts with an introduction to the use cases for building a web scraper and the main features of the Go programming language, along with setting up a Go environment. It then moves on to HTTP requests and responses and how Go handles them. You will also learn the basics of web scraping etiquette.
You will be taught how to navigate through a website using breadth-first and depth-first search, and how to find and follow links. You will learn how to track history in order to avoid loops, and how to protect your web scraper using proxies.
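The core of loop-free navigation is a visited set. The sketch below (an illustration, not the book's code) runs a breadth-first crawl over an in-memory link graph; a real crawler would fetch and parse each page instead of looking links up in a map.

```go
package main

import "fmt"

// crawlBFS walks a site graph breadth-first, recording each page in a
// visited set so that pages linking back to each other cannot cause
// an infinite loop. links maps a URL to the URLs it points at.
func crawlBFS(start string, links map[string][]string) []string {
	visited := map[string]bool{start: true}
	var order []string
	queue := []string{start}
	for len(queue) > 0 {
		page := queue[0]
		queue = queue[1:]
		order = append(order, page)
		for _, next := range links[page] {
			if !visited[next] {
				visited[next] = true
				queue = append(queue, next)
			}
		}
	}
	return order
}

func main() {
	// "/a" and "/b" link to each other: without the visited set,
	// this crawl would never terminate.
	links := map[string][]string{
		"/":  {"/a", "/b"},
		"/a": {"/b"},
		"/b": {"/a"},
	}
	fmt.Println(crawlBFS("/", links)) // [/ /a /b]
}
```

Swapping the queue for a stack turns the same skeleton into a depth-first crawl.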
Finally, the book covers the Go concurrency model, how to run scrapers in parallel, and large-scale distributed web scraping.
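Running scrapers in parallel typically means a worker pool: goroutines pull URLs from a channel and push results back. Here is a minimal sketch of that pattern under stated assumptions — `scrapeAll` and its `process` callback are hypothetical names standing in for a real fetch-and-parse step.

```go
package main

import (
	"fmt"
	"strings"
	"sync"
)

// scrapeAll fans a list of URLs out to `workers` goroutines over a
// channel and collects results back, the usual Go pattern for running
// scrapers in parallel. process stands in for a real fetch-and-parse.
func scrapeAll(urls []string, workers int, process func(string) string) []string {
	jobs := make(chan string)
	results := make(chan string)
	var wg sync.WaitGroup

	for i := 0; i < workers; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for u := range jobs {
				results <- process(u)
			}
		}()
	}
	go func() {
		for _, u := range urls {
			jobs <- u
		}
		close(jobs)
	}()
	go func() {
		wg.Wait()
		close(results)
	}()

	var out []string
	for r := range results {
		out = append(out, r)
	}
	return out
}

func main() {
	urls := []string{"/a", "/b", "/c"}
	got := scrapeAll(urls, 2, func(u string) string { return strings.ToUpper(u) })
	fmt.Println(len(got)) // 3 results; order is not guaranteed
}
```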
Implement Cache-Control to avoid unnecessary network calls
Coordinate concurrent scrapers
Design a custom, larger-scale scraping system
Scrape basic HTML pages with Colly and JavaScript pages with chromedp
Discover how to search using the "strings" and "regexp" packages
Set up a Go development environment
Retrieve information from an HTML document
Protect your web scraper from being blocked by using proxies
Control web browsers to scrape JavaScript sites
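To illustrate the "strings" and "regexp" bullet above, here is a small sketch (an assumption for illustration, not the book's code) that pulls a price out of an HTML fragment: `strings.Cut` narrows to the region after a label, then a regular expression matches the dollar amount.

```go
package main

import (
	"fmt"
	"regexp"
	"strings"
)

// findPrice extracts a dollar amount from raw HTML: the strings
// package narrows to the fragment after the label, then regexp
// matches the value itself.
func findPrice(page string) string {
	_, after, found := strings.Cut(page, "Current price:")
	if !found {
		return ""
	}
	re := regexp.MustCompile(`\$\d+\.\d{2}`)
	return re.FindString(after)
}

func main() {
	html := `<p>Current price: <span>$26.99</span></p>`
	fmt.Println(findPrice(html)) // $26.99
}
```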
Data scientists and web developers with a basic knowledge of Go who want to collect web data and analyze it for effective reporting and visualization.