Lazy use of AI leads to Amazon products called “I cannot fulfill that request”
The telltale error messages are a sign of AI-generated pablum all over the Internet.
@[email protected] dang within an hour they killed it. You guys gotta report on Amazon more
@[email protected] Oh great fun! And not just on Amazon, I notice. Many webshops are already doing their product descriptions with AI, it seems… AND not even looking at the results. #sigh 🥴
@[email protected] Cool… lol
@[email protected] Use of AI is lazy use by definition.
@[email protected]
As someone who sells through Amazon and relies on those earnings, this is really disheartening. Over the last 2 yrs it’s become so competitive due to their ‘sponsor’ manipulation, and now this … FFS. They have used auto-translation for years too, with hilarious results. It is obvious that no human is validating the output on non-US/EN Amazon sites.
@[email protected] I’m sorry Dave, I’m afraid I can’t do that
@[email protected] I did a search for exactly that on Amazon; found nothing.