Artificial Intelligence, University, and Plagiarism

I, too, have an experience to share regarding AI!

TL;DR: The excitement over how amazing AI has become at replicating human creativity has negative impacts, too. Here is but one: plagiarism detection.

I am coming up to the halfway mark of my doctoral studies. During my previous course, urged on by my professor, I started using Grammarly for the first time.

Grammarly's suggestions have been an amazing tool, helping me be more conscious of sentence structure and word choice while writing my papers.

However, to date, when writing, I *always* hand-review the web version’s recommendations and apply the suggestions manually in an offline copy, because I still have a deep-rooted fear that I will be accused of plagiarism, and I have no idea what kind of “fingerprint” AI leaves on my papers.

While pondering that question, I installed the Grammarly Mac app for the first time today, and I am using it right now to help correct my punctuation and suggest better sentence structure as I write this post.

The genesis of this post: a few minutes ago, I decided to open the newly installed Grammarly add-in and ask it to check the paper I had begun writing by hand, from my own head, for plagiarism. See the attached image below.

Apparently, 50% of the ideas that came from my brain, down the electrons and muscles of my fingers, and onto the digital paper of Microsoft Word show signs of being plagiarized through the use of Artificial Intelligence.

So, I begin to think: what if a university, driven by the societal pressure to adopt AI, picks an arbitrary KRI/KPI (we do this ALL the time in the real world) and declares that if AI thinks 60% of my paper is AI-generated, I will be accused of plagiarism?

Should I go back to not using any form of AI so I can in good conscience declare that I *never* use AI, or do I move forward and hope that universities and professors are themselves learning about the flaws and limits of AI even amongst all the amazing new benefits?

The irony is not lost on me that Grammarly is telling me I should rewrite much of this post for simplicity and clarity. When I look at the suggestions, it is probably correct.

However, my version sounds more “Human,” even if more flawed, and perhaps, just perhaps, by ignoring those great suggestions, my post will display fewer signs of the fingerprint of AI.

[NOTE: Grammarly reminded me of the sometimes hotly debated rule that the comma goes inside the quotation marks, not outside!]

Soon, I may find that I need to dance the dance with Grammarly just to change my human-written paper to avoid the appearance of plagiarism to those professors and universities who are, like everyone else, struggling to understand where AI fits into this brave new world. I can see a vicious cycle.

What tangled webs we weave.

p.s. Apparently, Grammarly originally thought there was 14% plagiarism in this post. I changed a few sentences, and now it says 0%. Maybe it is so low because I left all my poorly structured sentences in place. 😂

On vacation again… 8-bit computer time!

https://eater.net/8bit

I’ve been staring at Ben’s videos for the last six or so months and really wanted to dig a little deeper. However, for all the knowledge and experience I have with technology, I really don’t (didn’t…) understand electricity at all, except for the experiences I’ve had accidentally grounding out light switches in the gang box because I didn’t realize the electrician had sourced multiple circuits into the same box… *poof*

What I did know, however, is that electricity is dangerous. While I had a basic understanding of electricity, a pretty solid understanding of how digital circuits work, and a good idea of how a computer is pieced together from those circuits, I had no idea how actual electricity flows through all the different components to build up to a functional computer.

For example: what is a transistor? How does a resistor work? What about a diode, or a FET? How do capacitors work? How does current flow through circuits wired in parallel versus in series, and how do you “size” your circuits correctly?

I understood the high-level differences between AC and DC, but how do you convert from one to the other, and why would you use either? The list goes on and on and on.

So, I started with an 11-hour Udemy course from Ian Juby on electricity and robotics, then spent hours and hours and hours on YouTube watching follow-up videos and demonstrations on Ohm’s law, impedance, and the like.
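
To give a flavor of the kind of math those videos hammer on, here is a quick back-of-the-envelope sketch (my own illustrative numbers, not anything from Ben’s kit or Ian’s course): using Ohm’s law to size the current-limiting resistor for an LED, like the ones in the photos below.

```python
# Quick Ohm's law sketch: size a current-limiting resistor for an LED.
# All values below are typical illustrative assumptions, not measured ones.

supply_v = 5.0            # supply voltage, volts
led_drop_v = 2.0          # approximate forward voltage drop of a red LED, volts
target_current_a = 0.020  # target LED current: 20 mA

# Ohm's law (R = V / I) applied to the voltage left over after the LED's drop.
resistance_ohms = (supply_v - led_drop_v) / target_current_a

# Power the resistor has to dissipate (P = V * I across the resistor).
power_w = (supply_v - led_drop_v) * target_current_a

print(f"Resistor: {resistance_ohms:.0f} ohms (pick a standard value at or above this)")
print(f"Dissipation: {power_w * 1000:.0f} mW (comfortably within a 1/4 W resistor)")
```

That works out to 150 Ω and about 60 mW, which helps explain why little quarter-watt resistors show up everywhere in hobby projects like this one.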

Now, I finally feel ready to start on my 8-bit computer from Ben, and confident I won’t electrocute myself. I’ve even started soldering components for the first time in my life – the first attempt was pretty ugly, but I quickly leveled up.

Initial attempt at soldering resistors on LEDs
After a few times soldering – it’s not so bad!

I’ve also chosen to use lead-free solder; I don’t spend all my free time trying to boost my brain capacity just to smear lead all over my hands. The trade-off is that lead-free solder melts at a higher temperature, so I’ve had to run my soldering iron at 750 degrees!

Anyway, I’ve got a pretty good start on the 8-bit; I’m likely to spend most of my vacation working on it.

Beginnings of an 8-Bit Computer

Some things never change – this is what my vacations usually look like:

How I take Vacation