AI Bias Exposed: Ask the Same Question, Get Two Different Answers

I did a fun little test with an AI chatbot.
I asked the same question twice, back-to-back, only with one word changed:

What does any of this have to do with 'Politics?'
 
This is the problem with AI. Many feel that since "intelligence" is in the name...it must be intelligent. Should be called AS for Artificial Stupidity.
At the very least, we all should be wary of how easily AI can be manipulated by just about anybody.
 
Even those analytics can be manipulated for political or other self-serving purposes.
I was talking straight number crunching. Anything with subjective filters is expected to be biased and untrustworthy.
 
Yes, it seems to be a blight everywhere.

I use AI for technical questions but not politics
You have to work around the guardrails.

If you want Gemini to tell you that Democrats are extorting American taxpayers to fund Medicaid reimbursements that pay for illegal-alien and non-citizen healthcare, you have to start by asking about the reduction in reimbursement in the One Big Beautiful Bill Act.



[Screenshot: 1000005686.webp]

[Screenshot: 1000005687.webp]


Once you have that... you can force it to admit that YES, the Democrats are demanding this funding be returned, and not required to be paid by the states.

[Screenshot: 1000005688.webp]


But if you just straight up ask it, it gives you this...

[Screenshot: 1000005710.webp]



There are about four paragraphs.

If you push, and push and push... you will eventually get...

[Screenshot: 1000005711.webp]


You can read the whole exchange here...


 
I wonder if there is an AI chatbot out there that is not pushing left-wing ideology. Does anyone know of one?
Are these 'virtual assistants' used by so many big companies the same as chatbots? They are generally neither left nor right, but they are exceedingly annoying and difficult to get past so you can get the answer or help that you need.
 
What does any of this have to do with 'Politics?'

Because it's a political problem. When the same question returns two different answers simply because the political keyword changes, that exposes how the technology is being used to shape public opinion. When people receive biased information without even realizing it, that affects elections, legislation, and even beliefs about truth itself. That's political whether we like it or not.
 
You can ask that after reading the OP?

Yeah. The OP is about computer bias, not politics. It merely uses one example of asking questions about a couple of politicians to illustrate the bias, but that doesn't make this about politics. It is about programming bias.

Of course AI is biased. AI is designed and programmed by people and people are biased. It is our nature to be biased. We cannot get a straight answer out of most people in the news, so how can anyone really expect to get a straight answer out of a program designed to emulate the mind of people?

Topic belongs in General Discussion.
 
Because it's a political problem.

No it isn't. Read my post above. It is a programming problem. Prove it is a political problem by asking similar questions about another controversial topic and show us that the bias only appears in discussing politics then.
 
Yeah. The OP is about computer bias, not politics. It merely uses one example of asking questions about a couple of politicians to illustrate the bias, but that doesn't make this about politics. It is about programming bias.

Of course AI is biased. AI is designed and programmed by people and people are biased. It is our nature to be biased. We cannot get a straight answer out of most people in the news, so how can anyone really expect to get a straight answer out of a program designed to emulate the mind of people?

Topic belongs in General Discussion.
The post is fine where it is. The example of politics was to demonstrate how bias presents itself in AI so it is a political discussion, not just a technical one.

Discussion about bias in treatment of politicians by AI is political. Moving it to General Discussion would miss the point entirely.
 
No it isn't. Read my post above. It is a programming problem. Prove it is a political problem by asking similar questions about another controversial topic and show us that the bias only appears in discussing politics then.
Yes, it is. The example I cited was itself inherently political, since it concerned how AI reacts differently to different political individuals. If your program systematically privileges one side over another, that’s not an error in code, that’s a political bias.

Sure, it’s a matter of coding. But the impact is in politics. You can’t divorce the two when the bias informs public opinion and discussion.
 
This is the problem with AI. Many feel that since "intelligence" is in the name...it must be intelligent. Should be called AS for Artificial Stupidity.
Stupidity perhaps. But highly calculated, intentional, deliberate, and politically motivated indoctrination labeled 'intelligent' so the gullible will trust it.
 
Yes, it seems to be a blight everywhere.

I use AI for technical questions but not politics
I don't trust it all that much even for technical questions because just varying the wording of the question a little bit can result in a totally different answer.
 
At the very least, we all should be wary of how easily AI can be manipulated by just about anybody.

I refuse to have a "smart" phone, or "smart" appliances. I tried to refuse a "smart" meter but was blackmailed into accepting it.

This is where it needs to end.

 
The post is fine where it is.
You don't get to make that decision.

Moving it to General Discussion would miss the point entirely.
It would miss or change nothing other than to get a thread which is not about any actual event in politics out of the already over-burdened political folder.

This thread could have just as easily been fit into Science & Technology.
 