16 Comments

GIGO + algorithm = useless

Every time I see a story about someone freaking out over AI, I think about exactly that: GIGO. Adding in the algorithm to make it useless sums it up perfectly. (I wasn't trying for the pun there; nailed it anyway.)

I see what you did there!!! ;)

You provided a great service to humanity, Frank. We all needed a good laugh…

Huh, I didn't know that was you. Is this Frank's Axe Cop moment?

All my friends are super heroes. My friend Frank's power is super-skepticism.

I assume the moon-nuking will commence any day now.

Is there any AI trained to look at and draw conclusions from source data and documents? Is there any AI where I could input "What did the March 28, 2020 models predict as the number of North Carolinians that would be hospitalized with Covid on April 17, 2020, and what was the actual number hospitalized?" and get the correct answer: the most widely used model was revised downward to a prediction of 8,000-10,000 people hospitalized with Covid in North Carolina on April 17, 2020. On that date, the state dashboard said 812 were actually hospitalized.

If not, why not? Why isn’t it easier to point AI towards actual data rather than it making up data?

If you have data, some have a large enough context window to input it. Copilot can use searches.

Can it differentiate between some talking head making a fake data claim and an authoritative source or dashboard? I ask because when I first got out of undergrad/grad school circa 2005/2006, I used Google all the time, despite my access to expensive research software. At the time I did mid-size multinational tax planning ($250 million to ~$1 billion). The Big Four tax guides, Tax Foundation, and Tax Policy Center always came up, and easy-peasy, here's, say, the withholding rate on interest from Costa Rica. By 2015/16 I'd get all this crap, which was irrelevant or wrong, from WSJ, Forbes, and Bloomberg 99% of the time for basically the same type of questions. It seems it was easier to identify relevancy and get closest to the source (the Big Four cite sources) when the algorithm was simpler and more straightforward. It's gotten worse since then.

I asked about models because I wanted to teach my kids, then 6 and 8, about line graphs at the start of the Covid insanity, so I graphed the models and then we'd check the state dashboard for accuracy. I inadvertently gave them a lesson in propaganda too. Still, it seems AI should easily be able to create graphs like that. It's simple. A first grader can be taught it easily. It took 3 minutes a day in real time. It'd take 20 minutes today. Yet I can't get AI to do anything useful beyond creating a laugh. It seems like it would be easier to program and train a more objective language model as opposed to these chatbots that just make crap up or regurgitate the same dozen canned answers where Elon Musk ends up bad like Hitler. 🤦‍♀️
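The exercise described above (checking a model's predictions against what the dashboard actually reported) can be sketched in a few lines of Python. All the numbers below are made-up placeholders for illustration, not the real North Carolina figures:

```python
# Compare predicted vs. actually observed values, day by day.
# The dates and numbers here are illustrative placeholders only.

predictions = {          # date -> model's predicted hospitalizations
    "2020-04-15": 9000,
    "2020-04-16": 9500,
    "2020-04-17": 10000,
}
observed = {             # date -> value read off the state dashboard
    "2020-04-15": 790,
    "2020-04-16": 800,
    "2020-04-17": 812,
}

def overshoot_ratio(pred: dict, actual: dict) -> dict:
    """How many times larger the prediction was than reality, per date."""
    return {d: pred[d] / actual[d] for d in pred if d in actual}

ratios = overshoot_ratio(predictions, observed)
for date, r in sorted(ratios.items()):
    print(f"{date}: model overshot by {r:.1f}x")
```

The same two series could be handed to any plotting tool to produce the line graph; the comparison itself is just a division per date.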

Try Gab's AI.

Did you see the post where someone got Gemini to reveal what it adds to prompts before they get processed? It's very enlightening. They didn't just put a thumb on the scale; they outright sat on the scale and then invited a hippo to sit on their laps.

https://twitter.com/jconorgrogan/status/1760515910157078931?t=kYjdDfkkjo0wG6FWacaXWg&s=19

What annoys me most (or amuses me, I can't quite decide) is how Gemini scolds the user for having right-wing thoughts. It's so obnoxious and preachy.

Be careful. I have caught Bing lying to me (and it copped an attitude when I called it on that). I'm assuming Copilot will do the same.

Yes, they all hallucinate.

Copilot does. I tried to get it to create an image of a person wearing a deep cowled black cloak standing over an unconscious woman and it wouldn't do it. Some bullshit about not being able to do it.
