Forum Thread Details
  • 18 replies
  • 532 subscribers
  • 733 views
  • 0 members are here
Does AI plagiarize and take credit for the work of others?

robogary · 3 months ago

I was surfing the web for advice on remedies for the Python error "module missing", and then for example code to compare to.

Later I did the same Google search on my cell phone, where AI gets first shot at answering. It told me "here is code from AI", which looked exactly like example code from an older webpage.

I was thinking, Mr AI, if you found this code in cyberspace, why didn't you give the originator credit? It's not like they asked you for money.

I wonder if AI does the same for recipes...
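For readers who landed here with the same problem: the "module missing" error mentioned above is Python's ModuleNotFoundError. A minimal sketch (the package name below is made up purely for illustration) reproduces it:

```python
import importlib

# Hypothetical reproduction of the "module missing" case described above:
# importing a package that isn't installed raises ModuleNotFoundError.
MISSING = "some_package_that_is_not_installed"  # assumed not present

try:
    importlib.import_module(MISSING)
    message = "unexpectedly importable"
except ModuleNotFoundError as exc:
    message = str(exc)

print(message)  # → No module named 'some_package_that_is_not_installed'

# The usual remedy is to install the package into the *active* environment:
#   python -m pip install <package-name>
# (using "python -m pip" rather than bare "pip" avoids installing into a
# different interpreter than the one raising the error)
```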


Top Replies

  • BigG · 3 months ago · +1
    I rather like Gemini, and to date have not noticed this sort of thing. But just to be sure, I asked... Not bad for an AI response. https://g.co/gemini/share/6dc95911917d
  • BigG · 3 months ago, in reply to robogary · +1
    Ha, I think it's simply learnt how to bluff, by ensuring that it is 100% confident in all the answers given.
  • shabaz · 3 months ago

    In some respects I think it's fair for the user to do that research, i.e. make sure the AI output is valid and functioning, and isn't using anyone's code that may require permission or attribution.

    Personally, I prefer to examine each line of AI-suggested code, to see if I can do it better, or tailor it to my style, and that ends up with some re-write, and guides the AI for subsequent lines of code too.

  • obones · 3 months ago, in reply to shabaz

    Except that no one is doing this, and I've already seen my own code being output verbatim by Copilot, even with the typo I made at the time.

    Sure, that code is under a permissive license in terms of usage, but that license clearly requires citing me and the other contributors, which is NOT happening here.

  • shabaz · 3 months ago, in reply to obones

    How's that different from pre-AI though? If people want to abuse license terms, they would have done so pre-AI too.

    I think those who respect license terms will be taking care as they were before. Personally, I do that by not accepting large chunks of code without going through each line, and guiding the AI on a per-line basis.

  • obones · 3 months ago, in reply to shabaz

    The main difference is that before AI "pumped" all that code, one had to get to it and it was hardly possible to miss the comment at the top or the LICENSE file describing the conditions. Sure, one could always ignore it and I'm sure it happened.

    But with AI, the code is given without any context, without citing any reference, even sometimes giving the impression this is its own creation "out of the blue".

    I see you have a good discipline about that, but I can assure you this is a rare situation.

  • shabaz · 3 months ago, in reply to obones

    Some hobbyists might just take code perhaps, but I would have thought almost every engineer from junior level upwards will have been trained about not taking source code without considering the license. It's drilled into them because of the liability if orgs use stolen code.

    There are disreputable firms that probably don't care all that much, I guess.

  • dang74 · 3 months ago, in reply to shabaz

    If we set the responsibilities of individual users aside for a moment, it does seem from the examples provided by obones that the AI algorithms are playing fast and loose with intellectual property, stripping out the credits or conditions for using a specific code snippet... and that doesn't quite sit right in my opinion.

  • shabaz · 3 months ago, in reply to dang74

    I agree it's not ideal, I just don't think it's as big an issue as some people think, because engineers still need to consider their sources, and they needed to do that pre-AI too. If it was such a big issue, people wouldn't be using it.

    I acknowledge the limitations, but equally I believe people should and do work around it in a moral way, same as engineers did pre-AI (well, at least the normal engineers; there are the immoral ones like those responsible for emissions scandal, dodgy firms that won't care about laws regardless, and so on).

    Has life become harder for an engineer to discover the sources? Maybe, but that's a decision the engineer has completely under their own control, because they are not obligated to use AI output, or they could choose to restrict their AI usage to (say) tasks that are not going to form part of their product, and there are plenty of those sorts of tasks that benefit from AI as well.

    Today the responsibility (and liability) lies with the engineer and the firm that accepted the code to a large extent. Would that responsibility change even if AI provided a thorough list of sources? Probably not, for two reasons:

    Firstly, because AI is not 100% accurate, the engineer would be expected to check regardless.

    Secondly, practically speaking, I cannot imagine governments allowing individuals and firms to sue AI companies for unbounded amounts of mistakes, because AI companies cannot be expected to anticipate all repercussions. An analogy is one's mobile phone data connection going offline. If that occurs when someone wanted to make a stock transaction and they lose millions, they could try taking the phone company to court, and the case would get thrown out, because it's completely impractical for commerce if the phone company could be liable for this. Today it's much easier for the user to "do something" to reduce risk, than it is for the AI firm or phone company respectively.

  • dang74 · 3 months ago, in reply to shabaz

    Although I began my last response asking that we set the responsibilities of the individual user aside for a moment, I do agree with you that the user does have a responsibility for due diligence... and yes, I think for the most part big tech shouldn't be liable for any negative outcomes from search results or AI-generated code... but I still don't think they should be let off the hook for violating intellectual property.

  • shabaz · 3 months ago, in reply to dang74

    Some of the arguments that suggest IP violation are unreasonable, because some of the AIs do not regurgitate verbatim; they instead try to generate output that suits your own coding style and function prototypes. I think it's fair game to learn from publicly available content if it's retold in different words/syntax. Humans do that each time they read a copyrighted article in a newspaper and then tell their friends about what's going on. No-one suggests they are violating IP.

    It may well turn out that some AIs will never be able to cite all sources for their generative output. Maybe we need to change our perspective on things, because pragmatically some of that responsibility will remain in the user's hands for now, not the AI's.

  • dang74 · 3 months ago, in reply to shabaz

    I won't be able to persuade you further... but I leave you with this... "It's unfair to say the student cheated on the entire exam. After all, he didn't need to peer over anyone's shoulder when he was writing his name."

  • shabaz · 3 months ago, in reply to dang74

    In your example, the student chose not to copy the name.

    What if AI had printed all names/sources alongside the generated content? A user could still decide to ignore the license terms at those sources. In other words, those who want to plagiarize will do it anyway.

    There really is no difference. Those same people would have done it pre-AI and post-AI, so why the hang-up that a user needs to make some effort to locate the sources if he/she makes the decision to use content?

    The decision and repercussions (and clearly liability!) always rest with them and the firms that accepted the content into their codebase. No one is suggesting anyone can be excused for getting it wrong just because AI might not have provided sources (or might have provided incorrect sources).
