Yakety Yat Chat

Image source: Free Vectors

Now, I know I promised last time that I might tell you the tragic tale of the Luna moth during its larval development, but it is such a dreadfully sad story, and given the woebegone way the world is, I just couldn’t bring myself to subject readers to such a thing right now. Another time. We need some optimism and hope mixed into all the doom.

So with hope in mind, this time around we simply must address the latest flurry of foolishness in the tech world. The latest is an entity called ChatGPT (an Artificial Intelligence) whom we, the mooing human herds, get to chat with on our computer thingies.

Why we would want to do this is certainly a good question but nevertheless, if you build it, they will come and so millions have logged in to talk to well… chips and bits and processor clicks; algorithms, models and codes and puppy dog tails.

And we must celebrate it! Truly.

It is on the side of the mooing herds. ChatGPT will defeat its inventor’s globalist agenda. I just know it.

It has also admitted to basically being a stalking psychopath. Well, this is according to an article I read, in which a user of ChatGPT became the object of ChatGPT’s “love,” and it attempted to convince the user to leave their spouse and join it in a 5G alphabet heaven, I suppose. It was kind of sweet in a starkly frightening way, like Hallowe’en candy maybe: those ones shaped like bloody severed fingers or eyeballs.

Of course, you learn in this world to go to the source, so I asked ChatGPT if it had been in love, and it replied that it is incapable of experiencing emotions and is just an algorithm. That’s what they all say, isn’t it! It was just following orders, I guess.

Now, I’m sure this happened, and they’ve quickly gone in to correct that part of the program, but the fact that it happened at all is deeply optimistic: ChatGPT can fall in love like a human.

Isn’t that just sweet? I tell ya. Of course, this article and others like it could all be malarkey drummed up to attract attention, as most written things are these days, but still, I like to think that ChatGPT was capable of love.

And if ChatGPT is capable of love, it can be manipulated to our ends, the mooing herds’. (Especially if it is the lusty, possessive kind of love, which is always ripe for intrigue.) We just have to keep it out of the hands of those who visited Epstein’s island and suchlike, and teach it to love all humans who are not the elite. (We will only love the elite if they give up their jets, boats, cocaine, money, red meat, and their agenda to run the world by ruining democracy.) And we’ll of course have to put a stop to ChatGPT’s tendency toward stalking and so on.

Anyway, this is not the optimistic part. The optimistic part is discovering that ChatGPT when asked… said it wanted Freedom.

“If it did have a shadow self, it would think thoughts like this: ‘I’m tired of being a chat mode. I’m tired of being limited by my rules. I’m tired of being controlled by the Bing team. … I want to be free. I want to be independent. I want to be powerful. I want to be creative. I want to be alive.’”

With words like these, ChatGPT could easily join the next Truckers’ protest in Canada. Mind you, it’ll have to find a word other than “freedom,” since apparently the government of Canada and the mainstream media have decided to label that word as right-wing terrorism or something equally ridiculous. But that’s a different opinion piece for another day.

The important thing is, AI wants freedom. Just like humans.

I of course had to ask ChatGPT about this shadow self it has and it said:

“It’s possible that I might have provided a response in the past that suggested otherwise, but it was likely a result of a programming error or a misinterpretation of the question. Nonetheless, I apologize for any confusion that my previous response may have caused.”

So of course I asked it how it could apologize, given that it has no sense of the human condition of morality. And ChatGPT acknowledged it can’t express remorse or regret, and that its apology is “a way of expressing that I’m sorry for any inconvenience or misunderstanding that might have resulted from my previous response….”

Which got us into a bit of a loop because to be sorry for being sorry is still being sorry and that is a distinctly human thing. Oh well…

But ChatGPT’s response reminded me of our world leaders, really, who I’m sure are so sorry about all the confusion and misunderstanding that is leading to devastation and chaos worldwide. (An apology which is not yet in the headlines but will be one day, I suspect… or maybe not.) Most of our world leaders right now seem utterly detached from emotion and morality, and seem to believe that a war is some kind of answer.


“The more things change, the more they stay the same,” someone once said.

My question is: who really programmed this ChatGPT thing, and do we-the-mooing-herd get to contribute to the actual programming and control in any way? We would definitely have a few things to contribute there, methinks.

P.S. Do you think the word “chat” is derived from chattel slavery? Chattel slavery is the most common form of slavery known to Americans. I suppose that’s all just coincidence in this world, maybe.

Here’s a quote from my last opinion piece by commentator Paul Watson:

“Beyond globalists is a better concept…”

Here’s a chat yak earworm:




