Humanity

How would the media cover the trial of Jesus Christ if it happened today?

Before sunrise

If Jesus Christ were put on trial today for the crime of treason, I often wonder how the media would cover it. Would he be portrayed as an angry madman, the leader of a cult committing unthinkable crimes against the state? Somebody declared a menace to God and man, most deserving of his painful death on the cross? I am sure the District Attorney would be featured at a press conference, discussing how long-sought justice was finally served against Jesus Christ, and how his death on the cross would deliver long-needed closure for the victims.

Television would also bring on the voices of people victimized by Jesus Christ's acts of treason, his organizing of people against the lawful orders of the state. They might briefly mention his work on behalf of the common man, but argue that the way he went about confronting the aldermen and the rich power brokers isn't the way forward. The media would attach every bad action of his followers, the violence, the hatred, to Jesus Christ himself, even though, as the leader of the common man against the rich, he was not responsible for any of it.

Seeking a Sounding Board? Beware the Eager-to-Please Chatbot. – The New York Times

For almost as long as A.I. chatbots have been publicly available, people have enlisted them for interpersonal advice — for help drafting breakup texts, giving parenting advice, deciding who was in the right after a fight.

One of the main draws is that it feels objective: “The bot is giving me responses based on analysis and data, not human emotions,” one user told The New York Times in 2023. But the results of a new study, published Thursday in the journal Science, show that chatbots are anything but impartial referees.

The researchers found that nearly a dozen leading models were highly sycophantic, taking the users’ side in interpersonal conflicts 49 percent more often than humans did — even when the user described situations in which they broke the law, hurt someone or lied.

Even a single interaction with a sycophantic chatbot made participants less willing to take responsibility for their behavior and more likely to believe they were in the right, a finding that alarmed psychologists, who view social feedback as essential to learning how to make moral decisions and maintain relationships.