Enhance Bulk URL Analysis Error Handling #817
Comments
Hi @tsunoyu, thanks for reporting the error. You're absolutely right that errors should be handled gracefully so the process continues for the rest of the URLs in the sitemap. Unfortunately, we haven't been able to replicate the issue on our end. Please share the specific sitemap or CSV file you used so we can try to reproduce it and find a solution.
@tsunoyu, thanks for your request. This is part of our roadmap, and this more robust behavior will be included in an upcoming version of PSAT. In a nutshell, the output of aggregated analyses (i.e. sitemap/CSV) should be resilient to URL errors, and a summary of such errors should be part of the generated report. We will keep this issue open until we merge the corresponding changes.
A related issue: #818
Hi @tsunoyu, we're excited to announce the release of PSAT v0.12.0. This version includes a significant improvement in error handling: the PSAT CLI now gracefully handles errors encountered during analysis and lists the URLs with issues in the report under a dedicated “URL Issues” section. We recommend updating your PSAT CLI to the latest version (v0.12.0). You can update by pulling the latest changes from the repository or by following the update steps outlined in our wiki. Please let us know if you have any questions.
Feature Request: Enhance Bulk URL Analysis Error Handling
Description:
Improve the bulk URL analysis tool's error handling to ensure it continues processing even when individual URLs encounter errors. Provide detailed error reporting, including the specific URL that triggered the error, to allow users to investigate further.
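The requested behavior can be sketched as a per-URL try/except loop that records failures instead of aborting. This is a minimal illustration, not PSAT's actual implementation; `analyze_url` and the report shape are hypothetical.

```python
def analyze_urls(urls, analyze_url):
    """Run analyze_url on every URL, collecting failures instead of aborting.

    Returns a report dict containing successful results and a "url_issues"
    list that pairs each failing URL with its error message, so the report
    can still be generated and the problem URLs investigated later.
    """
    results = []
    url_issues = []
    for url in urls:
        try:
            results.append((url, analyze_url(url)))
        except Exception as exc:
            # Record the failure with the offending URL and keep going.
            url_issues.append((url, str(exc)))
    return {"results": results, "url_issues": url_issues}
```

The key design point is that the per-URL exception handler is inside the loop: one bad URL never unwinds the whole run, and the collected `url_issues` list becomes the report's error summary.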
Motivation:
The current behavior of the tool, halting analysis and report generation entirely when an error occurs, hinders productivity. Users need to manually identify the problematic URLs and re-run the analysis, which is time-consuming. This enhancement will improve the tool's robustness and usability.
User Story:
When analyzing 1000+ URLs in bulk, I want the tool to gracefully handle errors and continue processing the remaining URLs, so that the full analysis completes and I get a comprehensive report even if some URLs fail.
I also want the tool to explicitly identify the URLs that triggered errors so that I can investigate them further if needed.
Acceptance Criteria:
Additional Information:
The current error output does not indicate which URL failed, and no report is generated even though the analysis has completed for the other URLs.
This enhancement will significantly improve the user experience by making the bulk URL analysis tool more resilient and informative.