No artificial intelligence of any kind had a hand in writing this article.
I feel like I should make that disclaimer on everything I write now, because everywhere you look, people are leaning on AI to write the words that humans used to write.
I get it – it’s easy, and like everyone else, people who write on the web for a living love anything that makes the job easier. But the results are awful to read and full of misinformation. To me, it’s not worth the trade – been there, done that.
Interestingly enough, at least for me, Google has made the same decision about AI-produced articles. It may not have been for the right reason(s), but like a broken clock, Google gets it right every once in a while.
Google is one of the companies at the forefront of AI (Artificial Intelligence) and LLM (Large Language Model) research and has demonstrated some great tools. AI is used in many parts of your Android phone, as well as a slew of Google services, but the company hasn’t released a comprehensive, consumer-facing AI interface capable of writing theses or blog posts.
The official reason why, according to Alphabet CEO Sundar Pichai and SVP of Google Research Jeff Dean, sounds weird: “the costs if something goes wrong are higher because people have to trust the answers they get from Google.”
Like most everyone else, I took that answer as code that really means “we haven’t found a way to monetize it and use it for search” and figured Google was blowing smoke. Either way, it’s starting to look like the company made the right choice, regardless of the reasons.
AI does what it is taught and nothing more
AI is neither artificial nor intelligent. It’s software that regurgitates what it’s fed and is programmed in how to present it. It can (almost) drive a car, it can identify a cat, and it can write essays that people try to pass off as their own.
There are also examples of just how dangerous AI can be, like suggestions that wood chips make cereal taste better or that broken porcelain adds calcium and essential nutrients when mixed into baby food. I was turned off when AI obviously plagiarized human authors and ignored any sense of journalistic ethics, but this is insane.
Imagine if some of that came from Google. We lose our collective minds when the camera app takes three seconds to load, and that’s before we’ve eaten any wood chips that would probably make us even grumpier.
Again: imagine if Google had served up misinformation built on basic math errors and packaged as a financial advice article. That would have ended with Pichai dragging a wooden cross up a hill if the people had their way. We hate Google, but we also trust Google to tell us where to find what we’re looking for and not give us bad advice. We probably shouldn’t, but we do.
There is no way back
This does not change the fact that AI is quickly replacing the outsourcing and freelance writing industry. That’s unfortunate, but it’s also partly Google’s fault.
In case you don’t know, there’s an entire industry built around finding ways to put buzzwords in a title so that it appears at the top of Google’s search results. Every company that creates Internet content does it, including Android Central and my favorite YouTuber, who calls all the broken cars he fixes “abandoned” and “forgotten.” Those words move the video to the top of the search results, just like “best” moves a written article higher. Those are the words we use to search for “things”, so that’s what we get.
AI is really good at SEO
It’s called Search Engine Optimization, and AI is very good at it. The actual content of an article doesn’t matter, because once you land on the page and see the ads, the website can count it as a hit, just as if you had spent 10 minutes reading valuable content.
You’ve probably noticed that Google (and Bing, and Jeeves, or whatever other search engine you use) is returning worse results because of SEO. AI will only make this worse. It doesn’t have to be that way.
Words written by an AI are easy to detect. You can try the demo of this tool by Edward Tian to learn more about the methodology, but if one person can build an accurate tool to weed out AI-written content in just a weekend, Google can filter it out of search results. Yet it doesn’t.
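To make the idea concrete, here’s a minimal sketch of one signal detectors like this lean on – “burstiness,” or how much sentence length varies. This is a toy illustration of the general approach, not Edward Tian’s actual code or a calibrated classifier; the threshold below is an assumption made up for the example.

```python
# Toy sketch: flag text whose sentence lengths are unusually uniform.
# Human prose tends to mix long and short sentences (high variance);
# machine-generated prose is often flatter (low variance).
import re
import statistics


def burstiness(text: str) -> float:
    """Return the standard deviation of sentence lengths, in words."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    lengths = [len(s.split()) for s in sentences]
    if len(lengths) < 2:
        return 0.0
    return statistics.stdev(lengths)


def looks_machine_written(text: str, threshold: float = 4.0) -> bool:
    # The threshold here is illustrative, not calibrated on real data.
    return burstiness(text) < threshold


if __name__ == "__main__":
    sample = (
        "The phone is good. The camera is good. The battery is good. "
        "The screen is good. The price is good."
    )
    print(burstiness(sample), looks_machine_written(sample))
```

Real tools pair a signal like this with a language model’s perplexity score, but the principle is the same: machine-written text tends to be statistically more uniform than human writing.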
There’s probably a reasonable explanation as to why, but we don’t know and frankly I don’t care what it is. Articles written by AI are just bad and I would like never to be linked to them by a search engine.
Google will probably eventually jump on the AI bandwagon for consumer bots, maybe even in 2023. It won’t be pretty, because chances are Google has much more advanced conversational AI than anything we’ve seen so far. It’s going to have to be heavily censored and constantly edited so it doesn’t turn into something that makes Microsoft Tay look tame and comforting.
Then it will have a reason not to filter AI out of search results and our lives forever. 💰