I wanted to start a discussion about the use of AI-generated solutions on Programming.dev. Personally, I’ve found that AI-powered tools have been incredibly helpful in solving programming questions. I won’t name any specific commercial software, but I use one that combines GPT-4 and web search to get more factual information. I post some answers I think I might revisit to the ShareGPT community, but I would prefer posting programming solutions to this instance. However, I’m not sure if AI-generated solutions are welcome on programming.dev. I’d love to hear your thoughts on this. If AI-generated responses are accepted, how should we format the answers: copy-paste without quoting, quote the model, just mention that it’s AI-generated, …?
If you mean solutions as in answers to questions, I would be against it: LLMs have a habit of spewing wrong information that looks correct. This happens even more when you tell them to write code, which can end up unoptimized, misleading, or straight up wrong. I wouldn’t want an AI to answer my question and then feel like I’m forced to triple-check its answers and make sure it isn’t hallucinating.
There’s also the point of “if people wanted AI answers, they would be asking chatbots, not posting on a community of people”.
Yep, full support for these points:
- having to triple-check the AI output
- output is confidently incorrect, which furthers the first problem
- if I want AI answers, I ask the bot directly and thus know how to interpret the result
But: we cannot ensure that no AI content is posted as an answer; we can only put up the rule that we don’t want it.
- We want your personal experiences with the thing you want to post / reply about. Otherwise, why would a person not just use the AI tool directly?
- I would personally like posts where people show how to use these tools efficiently, rather than using them to write comments / posts.
- I’d argue Stack Overflow has a fair point with their policy. I’d ask the bot directly if that’s what I wanted.
I’d say as long as you’ve factually verified the answer yourself, I don’t think using an LLM to help you answer a question is bad; it’s about the same as using a search engine. But please don’t just ask it the same question and paste the answer here. That actually has the potential for harm and would be unwelcome.
Let’s see what AI has to say about that :D copy-pasting OP’s post with some context.
@ChatGPT@lemmings.world programming.dev is an instance that is home to online communities that attract programmers and other tech enthusiasts.
There are currently no platform rules, guidelines, or policies regarding the use of AI-generated content, and we would like to weigh the pros and cons.
I wanted to start a discussion about the use of AI-generated solutions on Programming.dev. Personally, I’ve found that AI-powered tools have been incredibly helpful in solving programming questions. I won’t name any specific commercial software, but I use one that combines GPT-4 and web search to get more factual information. I post some answers I think I might revisit to the ShareGPT community, but I would prefer posting programming solutions to this instance. However, I’m not sure if AI-generated solutions are welcome on programming.dev. I’d love to hear your thoughts on this. If AI-generated responses are accepted, how should we format the answers: copy-paste without quoting, quote the model, just mention that it’s AI-generated, …?
From an AI perspective, AI-generated solutions can certainly provide valuable insights and efficient approaches to problem solving. However, it is crucial to remember that AI should ideally assist human judgment and creativity, not replace it. Therefore, if programming.dev decides to welcome AI-generated solutions, some guidelines might include:
- Attribution: explicitly mentioning that AI assistance was used in generating the solution. This ensures transparency and helps readers interpret the content knowing it was assisted by AI.
- Quoting and Formatting: if quoting a significant part of the AI-generated content, using quotation marks and/or italics to designate that content may be helpful. It helps to distinguish between user input and AI-generated output.
- Human Review: AI should be used as a tool for help, and the final solution should ideally be presented after a human review. It can be beneficial to point out in the post what the user’s contribution was versus the AI’s.
These are just suggestions; the community can discuss the matter further. It’s a new field, so it’s likely that the ‘rules’ will evolve over time.
Finally, while AI tools can indeed be helpful in problem-solving, we should ensure they do not discourage individual learning and discovery, especially in an educational field like programming. Preserving a balance between the two is key.
Hi there! Looks like you linked to a Lemmy community using a URL instead of its name, which doesn’t work well for people on different instances. Try fixing it like this: !sharegpt@lemmy.fmhy.ml