I was on a call with my business partner, Ryan Santschi. A Zoom call.
“Hey, I’d love to stay and chat, but I have a column to write this morning,” I said.
“Okay. No problem. What are you going to be writing about?” asked Santschi.
I gave him the elevator pitch for what I was planning to cover in my column. He asked a few more clarifying questions. I could tell he was working on something; I could hear him typing. This is not especially interesting on its own, as we’re all guilty of multitasking on Zoom calls.
“How many words are your columns, generally?”
Okay. His level of interest in my writing was starting to pique my interest. What was he up to?
No more than a minute into this peculiar and out-of-character exchange about my writing topic, he sent me a text file. It was an 800-word column anchored on the brief description I had given him.
He had told an artificial intelligence-powered content generator to write an 800-word article on the topic I gave him. It did so in under a minute. That was what he had been doing behind the scenes.
The article was good. It was clean. It was well written. It was relevant to a Canadian ag audience and it included considerations that showed real insight into the ag industry. I could have submitted that, and I am not entirely certain my editor would have noticed that a computer wrote it.
To add to the strange nightmarish reality AI represents, I asked Santschi if he could ask the software to re-write the same piece in the style of Toban Dyck. It did. And it did a scary good job of it.
If you know me, you know that I enjoy technology, from actual pieces of gadgetry to its philosophical/societal implications.
The natural response to this kind of technology is fear over its ability to replace humans. Why would anyone hire me as a writer if algorithms can scour all the words, all the information, and all the sentence structures that exist on the internet and generate quality content?
My response to this is nuanced. I believe that for some things, some projects, it may make more sense to solicit the services of bots. And I think that’s okay. For a business owner to use AI to generate website copy or a LinkedIn post is perfectly acceptable in some instances. There are about a million caveats I want to attach to this AI endorsement, but for the sake of time, I’ll just say: use it, as long as doing so is within the bounds of reason.
In February, I spoke at the Ag Awareness Summit in Saskatoon. On my way there, while waiting in the airport, I got AI to write me an ag policy recommendation related to environmental sustainability.
Not only did it write something that seemed to be new (as in, it generated the idea), it did so in the correct format, including a reference page that included reputable research documents.
When I spoke about this at the conference, a professor who approached me after my presentation said that universities are already using programs that can detect whether student submissions were created using artificial intelligence.
It doesn’t scare me to think of a computer being able to write as well as I can. Not at all. Rather, I believe that being afraid of this kind of technology, instead of taking the time to develop a deeper understanding of it, is tantamount to bowing out of a trajectory agriculture and the rest of society have been on for quite some time.
AI-produced articles of passable quality may seem very now and like a bit too much to handle, but the writing has been on the wall for some time.
People anthropomorphize things. We tend to talk about things in uniquely human ways, because we are human and we associate the creation of an article or column with the act of writing. When a computer generates an article, we say it wrote said article, bestowing upon it human characteristics. This, in turn, terrifies us, even though we are the ones attributing those values to it.
AI does not write things in the human sense. The processes used by AI software to create products, such as articles, can and should be explained in terms of what they actually are. I don’t know what they are, but I am guessing they are complex and linear in ways that are characteristically not human, like, say, a series of beeps and boops that only make sense to a handful of engineers.
Perhaps the human brain and what it generates can be fully explained in much the same way complex computer systems can, but that’s not for this column to debate.
For now, as I have said many times — and will continue to — we should do our best to make sense of these things instead of letting them terrify us.
There is a lot of technology coming down the pipe for agriculture. Some of it will be good. Some of it will be ineffective, and most of it will need tweaking. The industry will rely on those bold enough to seek understanding rather than rushing to appoint themselves its technology gatekeepers, whistleblowers and judges.
Besides, I am terrible at editing my own work, so if a computer can write my first draft in the style of Toban Dyck, then perhaps I can dive in and make it better than something I would have been able to write on my own.