Link Extractor
For this project you are to build a command-line tool that can extract the links from any URL on the web. The tool should be implemented in standard C++, although it is perfectly reasonable to use APIs specific to the operating system you are building on, since standard C++ does not provide native support for networking.

The tool should have the following behavior when run from the command line (the text does not have to match exactly; the example just gives a general idea of the functionality):

.\tool.exe
Please enter a url address: http://example.com
The following links were found:
http://example.com/page1
http://example.com/page2
http://google.com/search
http://mdn.com/asdf
Done!

To achieve this, you will need to design the program so that it can read the URL entered at the console, download the contents of that URL over the network, and parse the resulting HTML for anchor tags containing href attributes. If you need a refresher on how anchor tags work and how HTML in general is structured, you may want to look at MDN.

The program should be robust and not crash, even when given an invalid URL or a URL that points to something other than HTML. If the program is not able to parse HTML from the contents returned from the internet, it is fine to abort silently or print an error message, but crashing is not acceptable.

The program should be designed well. You should take advantage of classes, containers, and modern C++ algorithms that we have learned about in class.
I need this for visual studio C++. Thank you
