I had feedback in the form of a question to open a discussion, rather than to include in this version of the Guidelines.
If a publisher uses AI in the creation of alternative descriptions, should this information be made available in the metadata as well?
If there are human checks, or no human checks, on the quality of the alternative descriptions, should this be indicated in the metadata?
Would this level of information be useful for users or readers?
I think that AI is going to be used in many places, not only for alt text or image descriptions. A publisher is responsible for the content, and as part of the editorial process they strive to ensure the title meets their quality standards.
I do think that if AI descriptions are provided without editorial review, then the end user should be alerted. This would give the reader an indication that the descriptions should be treated like other AI-generated descriptions, i.e. with some skepticism.