Improve the performance of the link checker process
Currently, the link checker routine loops through its list of URLs and sends requests one at a time. Because each request blocks the process while it waits for a response from the web server, we could significantly improve the link checker's performance by testing multiple URLs in parallel, asynchronously.
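The parallel approach could be sketched roughly as follows. This is a minimal illustration in Python using a bounded thread pool, not the product's actual implementation (whose language and internals are unknown); the function names and the `max_workers` default are assumptions for the example.

```python
import concurrent.futures
import urllib.error
import urllib.request


def check_url(url, timeout=10):
    """Return (url, status): the HTTP status code, or an error string."""
    try:
        # HEAD keeps the check lightweight; some servers reject HEAD,
        # so a production checker might fall back to GET.
        req = urllib.request.Request(url, method="HEAD")
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return url, resp.status
    except urllib.error.HTTPError as exc:
        # The server responded, just not with 2xx; record the code.
        return url, exc.code
    except (urllib.error.URLError, OSError) as exc:
        return url, f"error: {exc}"


def check_urls(urls, max_workers=8, checker=check_url):
    """Check many URLs concurrently with a bounded worker pool."""
    results = {}
    with concurrent.futures.ThreadPoolExecutor(max_workers=max_workers) as pool:
        for url, status in pool.map(checker, urls):
            results[url] = status
    return results
```

Because the pool is bounded, in-flight requests never exceed `max_workers`, so the checker overlaps network waits without flooding any server.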
This will require some research to determine how many web requests we can make at once, since the limit may differ depending on the system, its configuration, or the domain name of the links being checked.
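One way a per-domain limit could be layered on top of a parallel checker is to wrap the check function so that each host gets its own semaphore. This is a hypothetical sketch (the wrapper name and the `per_domain` default are assumptions), showing only the mechanism, not a tuned policy:

```python
import threading
from urllib.parse import urlparse


def make_domain_limited_checker(checker, per_domain=2):
    """Wrap `checker` so at most `per_domain` requests run per host at once."""
    semaphores = {}
    lock = threading.Lock()  # guards creation of per-host semaphores

    def limited(url):
        host = urlparse(url).netloc
        with lock:
            sem = semaphores.setdefault(host, threading.Semaphore(per_domain))
        with sem:  # blocks if this host already has per_domain requests in flight
            return checker(url)

    return limited
```

The overall pool size then caps total concurrency, while the semaphores keep any single domain from being hammered, which matters if some target servers rate-limit or throttle by IP.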
Jessica Bower commented
If we could at least have a way to "ignore" links in the Invalid Links queue that were actually valid, and have it stick (i.e., not show up again 24 hours later), that would help. Or just have a way to correct the system, telling it that a link it thinks is invalid is actually valid.
Thomas Rouleau commented
(Kerry on behalf of:) U Ottawa would also like this to work by checking from a staff client (or similar location) instead of from the server, because our server's internet access is restricted and all links appear as broken. This would also help with the issue of Atlas-hosted servers not being able to reach/authenticate web addresses that are restricted to campus IPs.
Theresa Spangler commented
Our link checker is consistently showing URLs in the Invalid queue that a) do work, b) appear again after the Retry process, c) are all from the same domain, and d) said domain is already in our IgnoreDomains list in the Customization Manager. Our ezproxy language may be impacting that last factor. It's more annoying for our staff than an urgent matter of user access, but still one I'd like tweaked/fixed. I will add that Atlas staff have been great in their previous and current investigations of our particular issue!
I've got 60+ things stuck in my Link Checker queue. I've retried them several times and they always work. They distort my ability to see whether a new issue has arisen. Thanks for considering this needed change.
Genie Shropshire commented
If the Ares server is hosted, it can't connect to any proxied items to verify they work.
Jenny Vitti commented
In the Link Checker, add the ability to mark a URL as cleared/working for the semester. Many of the URLs caught by the Link Checker work properly when humans open them in regular browsers, but show up persistently as errors in the Link Checker, even after passing them through the Retry process. Since the Link Checker is tied to ItemID, cloned items would still come back to the queue in future semesters, which would ensure that each URL is checked every time it is used.