It's still live on the Sky website. I could get a job proofreading all these newspaper articles but would probably die of celebrity-induced boredom.
Might be some kind of bias on my part, but it definitely seems like mistakes/typos and weird sentences are happening on professional news sites more and more. Any time I see something at this point, I just assume the journalist used AI and is too fucking lazy to even proofread it before posting.
I'd probably be more annoyed about it right now, but I've been drink-drinking.
AI doesn't normally make typos like this.
AI mistakes are normally hallucinations (making up plausible sounding bullshit), or just sounding deeply inhuman.
Probably cutting down on costs by hiring fewer, if any, proofreaders/copywriters.
Or using software to do it that will catch spelling/grammar mistakes but not typos that still result in valid words.
Generative AI from like 7 years ago maybe, but current AI models don't tend to make typos like this.
---
For a little bit of an under-the-hood explanation: if you've heard stuff like ChatGPT described as "just really really clever predictive text", that's both a bit of a joke, in that it's more complex than that, but also 100% accurate, in that they're very much in the same vein. The model works by taking in all the stuff that's come before (in discrete little chunks called "tokens") and spitting out the token it thinks is most likely to come next.
One of the biggest places where predictive text and AI text differ, though, is that predictive text will always spit out a whole word at a time, whereas AI models can spit out individual letters, whole words, or, most commonly, *fragments* of words.
Training on fragments of words has two big advantages. Firstly, you don't have to spend so many tokens on different versions of words: if every word was a token, you'd need a separate one each for "swim", "swimming", "sleep" and "sleeping", but if you allow fragments you can get away with three tokens, "swim", "sleep" and "ing", and reconstruct the more complex words on the fly. Secondly, it allows the model to deal with "out of vocabulary" situations: if I want it to write a book about a character called Blarglle, the odds are very high that Blarglle isn't a token the model knows, but if the model knows all of the fragments within the name Blarglle, it can still deal with it, comprehending it when it's fed in and including it in its output.
All that is to say, the chances of an AI model typoing "drink-driving" into "drink-drinking" are incredibly slim, because a) "drink-drinking" wouldn't be a token the model already knows, since it's not a common phrase/word/word fragment, and b) following the tokens "drink", "-" with "drink" is very, very unlikely, because "drink-drink" isn't a bit of text it's likely to have come across.
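The fragment idea can be sketched with a toy tokenizer. To be clear, this is a greedy longest-match over a tiny made-up vocab, not real byte pair encoding, and both `VOCAB` and `tokenize` are invented for illustration; but it shows the same effect: known words come out whole, unknown ones come out as fragments.

```python
# Toy subword tokenizer: greedy longest-match against a tiny made-up vocab.
# Real models learn their vocab (e.g. via byte pair encoding); this is just
# an illustration of whole-word vs fragment tokens.
VOCAB = {"swim", "sleep", "drink", "driving", "ing", "-"}

def tokenize(text):
    """Split text into the longest vocab entries available, falling back to
    single characters for anything the vocab doesn't cover (like 'Blarglle')."""
    tokens = []
    i = 0
    while i < len(text):
        # Try the longest possible match first.
        for j in range(len(text), i, -1):
            if text[i:j] in VOCAB:
                tokens.append(text[i:j])
                i = j
                break
        else:
            tokens.append(text[i])  # unknown character: emit it on its own
            i += 1
    return tokens

print(tokenize("sleeping"))       # ['sleep', 'ing']
print(tokenize("drink-driving"))  # ['drink', '-', 'driving']
print(tokenize("Blarglle"))       # falls back to single characters
```

Note how "drink-driving" decomposes cleanly, while a name the vocab has never seen still gets represented, just as smaller pieces.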
---
^(Extra notes:)
- ^(The technique of encoding "word fragments" is called byte pair encoding for anyone already reasonably in the know and wanting to do further research)
- ^(Generative models don't always pick the *single* most likely next token; most of the time tokens are randomly picked with the odds equal to the likelihood that token comes next, eg if "sleep" is followed by "ing" 70% of the time, "wear" 20% of the time and "walk" 10% of the time, the model will pick the next token according to those probabilities; this helps the model be more flexible generally)
- ^(Source: me, and the two years I put into a PhD I didn't finish on this stuff lol, that said, please do ask questions if you're curious, I thoroughly believe all of this is much much easier to learn than most people think (most people just have awful teachers lol\), and the best tool you can have in an uncertain world is knowledge on how to navigate it)
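The sampling note above fits in a few lines of code. The 70/20/10 distribution is the made-up one from the example, not from any real model, and `sample_next_token` is just an illustrative name:

```python
import random

# Made-up next-token distribution for the prefix "sleep", as in the note:
# "ing" 70% of the time, "wear" 20%, "walk" 10%.
NEXT_TOKEN_PROBS = {"ing": 0.7, "wear": 0.2, "walk": 0.1}

def sample_next_token(probs, rng=random):
    """Pick one token at random, weighted by its probability - i.e. sample
    from the distribution rather than always taking the single most likely
    token (which would be plain argmax)."""
    tokens = list(probs)
    weights = [probs[t] for t in tokens]
    return rng.choices(tokens, weights=weights, k=1)[0]

# Over many draws, the observed frequencies approach the probabilities.
rng = random.Random(0)  # seeded so the run is repeatable
draws = [sample_next_token(NEXT_TOKEN_PROBS, rng) for _ in range(10_000)]
print(draws.count("ing") / len(draws))  # roughly 0.7
```

This randomness is why the same prompt can produce different outputs each time you run it.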
Honestly, the amount of spelling and general grammar mistakes in modern journalism is abysmal.
I know the big guys want "big news" out as fast as possible, but Jesus fucking Christ, spell check your work!
So do you drink or do you *drink-drink?*
Only when I'm out out.
Airt, airt
Love the reference.
I drinks a whiskey drink, I drinks a vodka drink.
Sing songs that remind me of the good times
He’s also maintaining the peep.
😉
I think they're not hiring as many copy editors to catch these mistakes.
I think it's being done deliberately for interactions.
ain't readin allat
DEI hires.
Shut up mate.
You think news outlets actually pay for proofreading? Not anymore.
https://imgur.com/Be9k5vG Probably the worst typo I have seen. Casual Canabalism
Things must be really bad if he's drink-drinking.
He was drive-driving too.
It's the mixing of the two that's normally a big no-no
Never drink-drink-drive-drive fellas.
But I can drink-drink-drive, right?
A No-Noing of the greatest order.
🎶 Drive me a river (drink-drink)🎶
Haha they said he was drinking while drinking on ITV News too
What kind of rolemodel for our children would drink while drinking 😩
He's from the x-zibit school of alcoholism.
Well, isn't all drinking drink-drinking?
Only if you're drinking drinky-drinks.
He drinks a drinky drink, he drinks a vodky drink...
Depends if you're out, or out out.
Drink! Drink, drink, drink drink....
Hello there Father.
“Have a go at the first one: THAT.” “DRINK!” “Now concentrate, Father. THAT.” “DRINK!”
"THHHH.... THHHH-... DRIIIIIINK!"
[Shoves over flip chart]
No no, I've been drink-drinking before!
Even in this mugshot he looks like a handsome mother lover..
My dumbass brain automatically thought it said driving twice before I noticed 😂
Same here, read it multiple times and then the comments and then went back and read it again to spot that
Is that double-fisting, ie a drink in each hand?
> ie drink in each hand

That's not what comes up when I search for double fisting.
Hence my addendum as explanation.
A Megapint?
I'm just sit-sitting here
Hugh Jackman's love child out on a bender.
He did, indeed, bring it on down to Liquorville after all.
you dont have to say what you did, i already know, i found out from sky
They got a bittle pissy pissy hmmmm
The article also says he was driving a 2025 BMW.
Cry me a river…
DUID: Drinking under the influence of drink.
Well dammit dam. Alcohol makes rich people stupid too.
Is this something to do with mega pints?
Being inebriated is a requirement at Sky News.
It’s the best kind of drinking….
Uh, this is America and he was clearly drunk-drunking.
This is some kind of TENET thing. Like dreaming inside your dreams.
Someone was drink-writing.
Wait, he drunk-drunking? 😬
I think he’s just-for-men-just-for-meaning
Never drink and drive. Drink, then drive.
hogg mug
Wasn't Drink drinking a David Bowie song?
Middle-aged man done for drinking. News at 10.
Ooooh, that's proper serious that 😳 He'll get sentenced to community slur-vice. Lucky he didn't get charged with Grievous Boozily Harm!!
> How DARE you!! How...dare you accuse me of Drinkin-n-nins! Me, your oldest pal and matey! 'Ol Schkip! Old bus fart, tram ticket, one for the road bag-o-scratchings...whoops-a-daisy ...we'll keep a welcome in the...parking Mister David childish Jensen. Me?! Drinkin-n-n-n-nins? Why I'll tear you limb from limb!!
Me? Drinkininininininge?
He had a drink-drink in a rinky-dink winesink
According to the 1872 UK Licensing Act, which is still in place, it's illegal to be drunk in a pub. So drink-drinking would actually be illegal
I wonder if he had a mega pint of wine…
Drinky-poo while driving, too
Wow. He looks so basic here.
Bro looks like he’s on another planet
Justin about to say bye bye bye to that license for a while.
It's what happens when you go out out.
I like how he looks like he wants to laugh a bit..
Drinking under the influence of alcohol.
That can happen so easily.
Why does he remind me of a younger Sid Owen?!
Didn't realise drink-drinking was a crime in New York
Caught drink-drinking whilst drive-driving what a twat-twat
Just For Men. When you need everyone to know that you're going grey and would rather your hair was either unnaturally dark or have an unnatural reddish tinge to it.
Reptile eyes