That’s not true! There are heaps of early-GPT articles pointing out how much bullshit it regurgitates (e.g. “Why does ChatGPT constantly lie?”). And no evidence at all that the breathless fanboys have even stopped to check.
It will almost always be detectable if you just read what is written, especially in academic work. It doesn’t know what a citation is, only what one looks like and where citations tend to appear. It can’t summarise a paper accurately. It’s easy to force laughably bad output just by asking the right sort of question.
The simplest approach for setting homework is to give them the LLM output and get them to check it for errors and omissions. LLMs can’t critique their own work and students probably learn more from chasing down errors than filling a blank sheet of paper for the sake of it.
If you were meeting up somewhere, you’d arrange for someone who was at home (and thus by a phone) to orchestrate any last-minute changes of plan or notifications of late arrival (via payphones, which were a thing, once).
You’d go into town regularly to pick up the new bus timetable.
You’d have a huge pile of maps in the back of the car, or one very big map book, often both. If you drove somewhere once, you’d remember the route the next time.
There was a set of encyclopedias at home to look up facts.
And a calendar on the wall. (That’s probably still a thing?)
There were a lot more newspapers and magazines around.
Everyone had a little notebook with all their important phone numbers in it. Filofax was revolutionary.
And we still remember the most important phone numbers from that little notebook because we had to dial them so very often.
We played I spy a lot.
Every right-wing accusation is a confession.