AI’s Different Results Thwart Plagiarism Checkers

Hi gang,

I want to follow up on the Practical AI for Instructors: Lessons Learned session from the Partner Conference in Atlanta on Thursday, August 1.

I asked an AI engine to generate content yesterday. One small change in my prompt produced a result with vastly different content, tone, and verbiage. Armed with this knowledge, I am arguing that plagiarism checkers will not generate consistent results on AI-generated content.

My client has all users logging into their Windows 11 workstations with the same password. Danger, danger, danger, I know. I asked Perplexity to generate content I can share with the client promoting user-specific passwords.

My first prompt was:

“why should a company with 5 computer users have unique passwords for each user and in under 300 words and include bullet points and in a friendly tone and written from the perspective of a cybersecurity consultant”

The first answer was:

“As a cybersecurity consultant, I'd strongly recommend unique passwords for each user in your 5-person company. Here's why, in a friendly and concise manner:

Enhanced Security: Unique passwords significantly reduce the risk of a widespread breach. If one password is compromised, only that account is at risk, not your entire system.
Accountability: Individual passwords allow you to track who accessed what and when. This is crucial for maintaining data integrity and resolving any issues that may arise.
Compliance: Many industries require unique user identification for regulatory compliance, such as HIPAA in healthcare.
Easier Management: While it might seem simpler to have one password, individual accounts actually make it easier to manage access rights and revoke them when needed.
Personal Responsibility: Users tend to be more careful with their own unique passwords, leading to better overall security practices.
Flexibility: Different access levels can be assigned to different users based on their roles and needs.

To make this manageable, consider:

• Implementing a password manager for your team
• Using multi-factor authentication for an extra layer of security
• Creating a clear password policy that everyone understands and follows

Remember, good cybersecurity is about layers of protection. Unique passwords are a simple yet effective layer that can significantly boost your company's digital defenses. It's a small change that can make a big difference in protecting your valuable data and systems.”

My second prompt requested only 5 bullet points:

“why should a company with 5 computer users have unique passwords for each user and in under 300 words and include 5 bullet points and in a friendly tone and written from the perspective of a cybersecurity consultant”

The second answer was vastly different:

“Hey there! As a cybersecurity consultant, I'm excited to share why unique passwords for each of your 5 users are a game-changer for your company's security. Let's dive into the top reasons:

Fortress-like Protection: Unique passwords act like individual locks for each user. If one lock gets picked, the others stay secure. This means a single compromised password won't give hackers the keys to your entire kingdom!
Sherlock-Level Tracking: With unique passwords, you can easily track who did what and when. It's like having a digital detective on your team, helping you solve any mysterious activities quickly and efficiently.
Regulation High-Five: Many industry regulations love unique passwords. By implementing them, you're not just boosting security, you're also giving a thumbs up to compliance requirements. It's a win-win!
Personal Security Superheroes: When users have their own passwords, they tend to become more security-conscious. It's like giving each team member their own superhero cape – they'll feel more responsible for protecting their digital identity.
Flexible Access Control: Unique passwords allow you to tailor access levels for each user. It's like having a VIP list for your data – you decide who gets the all-access pass and who gets the standard ticket.

Remember, implementing unique passwords doesn't have to be a headache. Password managers can make this process smooth and simple for everyone.”

Small change

All I did was request 5 bullet points, and the result is vastly different. My proposals do not promise “Fortress-like Protection,” “Sherlock-Level Tracking,” and “Regulation High-Five.” I could submit the second result to a teacher or academic journal while someone else submits the first result, and no teacher could accuse either of us of copying the other. The tone is widely divergent.

I ran the second prompt again this morning and got widely divergent results yet again. Because the output varies so much from run to run, teachers and academic journals cannot generate sufficient evidence of plagiarism. The prompts are highly similar; the results are not. AI wins. Plagiarism checkers lose.
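
To put a rough number on the difference, a few lines of Python can score how many word-for-word phrases two texts share, which is the kind of overlap plagiarism checkers generally look for. This is only a simplified sketch, not any particular checker's algorithm; the five-word phrase length is an arbitrary choice, and the two answer strings are placeholders for the full responses above.

# Rough sketch: score how many five-word phrases two texts share,
# a simplified stand-in for the phrase matching plagiarism checkers rely on.
# first_answer and second_answer are placeholders -- paste in the full responses.

def shingles(text, n=5):
    # Break the text into overlapping n-word phrases ("shingles").
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap(a, b, n=5):
    # Jaccard similarity of the two shingle sets: 0.0 = nothing shared, 1.0 = identical.
    sa, sb = shingles(a, n), shingles(b, n)
    return len(sa & sb) / len(sa | sb) if (sa | sb) else 0.0

first_answer = "As a cybersecurity consultant, I'd strongly recommend ..."   # answer from prompt 1
second_answer = "Hey there! As a cybersecurity consultant, I'm excited ..."  # answer from prompt 2

print(f"Shared five-word phrases: {overlap(first_answer, second_answer):.0%}")

I would expect the two answers above to score near zero, which is exactly why a checker has nothing to flag, and rerunning the same prompt shifts the score again because these models sample their wording rather than repeating it verbatim.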

More

AI ain't so tough. See https://cybersafetynet.net/category/ai/ for help understanding and using artificial intelligence.
 

jarrelrivera

Well-known member
Aug 15, 2019
Australia
www.jarrelrivera.com
I agree. There isn't a single system that can accurately detect whether a paper was written by AI or not.
What I do with my classes is ask the students to use a personal tone, with a practical reflection on "this is what I've learned... now, what?" This way, the students get to put their own thoughts into their papers.
 

Moorosi Mofokeng

New member
Jun 18, 2024
That might work. But remember, if you build a profile, AI can learn your tone and deliver more personalized results. Some models are so powerful that if you teach them how to talk like you, they can impersonate you perfectly, if not better.
 
Would an overworked/underpaid instructor do the necessary work to determine the difference between human-generated and AI-generated content? Seems the two writing samples would be incredibly close. :geek:
 
Bravo, and thank you for sharing, Jarrel. Everybody should polish AI-generated content to make it more reflective of the person submitting that work.
 

Rick Butler

Well-known member
Aug 8, 2019
Colorado Springs, CO
www.intellitec.edu
This is why, in my thinking, term papers, essays, and written assignments probably should diminish as an assessment of capability, going back to things like oral presentations and hands-on labs to demonstrate skill. A lot of colleges these days make use of discussion boards and the like to determine engagement in a class; that has to go too.

I did read somewhere (I'll have to find it again) some news this week about systems to detect the use of ChatGPT and OpenAI in written work. But like @jarrelrivera intimated, AI is only going to get smarter and more personalized as we go.

According to EducationWeek (Prothero, 2024), up to 80% of students now admit to cheating and using AI for written work. This is disturbing. However, I don't think we're going to win the battle through detection at this rate, but rather by changing assessments so that AI is functionally useless when assessing learning.

/r
     
AI keeps getting better at mimicking our writing styles, and it's becoming harder to trust that students are really putting in the work on their papers and essays. Mark's experiment playing with those AI-generated responses shows how tricky it's getting to catch this stuff with the usual plagiarism checkers. I think we need to mix it up and focus on things that really show a student's understanding.

When my students have to do a presentation, I force them to explain things in their own words, which gives me a much better idea of what they really know. And with simulated labs, they have to actually demonstrate their skills in real time, maybe in front of me, so they just can't fake it. Proctored tests are another great way to make sure the answers come from the student, not some AI tool. Of course, I use AI, but mostly to help me solve troubleshooting issues, get a set of routing commands that I need, or work out some server configurations. For that it works wonders!
     

abiodun1991

Well-known member
Mar 17, 2022
Nigeria
minazconsult.org
I agree with Rick and Jose. Oral presentations may be the ultimate solution. I'd also suggest presentations either in person or in a proctored environment like what Pearson Vue enforces for remote test takers.
I agree with @Rick; oral presentations are better.
     
I use CertMaster and TestOut material even for in-person classes. It's good to make them run labs and quizzes with the instructor present. You walk around the room to make sure they are not using special aids, and no phones are allowed on their desks; they go in their backpacks.
     

Brian Ford

Well-known member
Jul 15, 2021
Flagler Beach, FL
fordsnotes.com
First off, it was great meeting you in person at the Partner Summit, Mark! Thanks for pointing out this great instructor-focused presentation. I was a little concerned about ECPI instructor John Guise's presentation title, but the content was very valuable. John provided real examples of how different prompts to different AI tools produced different results and dove into some of the tools that could be used to detect students' use of AI to generate content. It was a good lessons-learned session.