ChatGPT is an easy way out for plenty of people, and now that includes judges. Yes, a judge in India leaned on artificial intelligence instead of his own wisdom when deciding whether a murder suspect should stay out of jail while awaiting trial. 

Jaswinder Singh is the suspect, arrested three years ago on a murder charge. Judge Anoop Chitkara wasn’t sure what to do, so he turned to GPT-4, the upgraded version of ChatGPT, and asked it to do his dirty work. 

His question was worded like this: “What is the jurisprudence on bail when the assailant assaulted with cruelty?”

In a matter of seconds, the A.I. tool kicked out a response. 

“If the assailants have been charged with a violent crime that involves cruelty, they may be considered a danger to the community and a flight risk. In such cases, the judge may be less inclined to grant bail or may set the bail amount very high to ensure that the defendant appears in court and does not pose a risk to public safety. However, in general, if the assailants have been charged with a violent crime that involves cruelty, such as murder, aggravated assault, or torture, they may be considered a danger to the community and a flight risk.”

That was all the judge needed to hear. He took that answer and rejected the defendant’s request for bail, holding him until trial because an act of cruelty was involved. 

It probably won’t be the last time this happens in India, either, since the country’s courts are backlogged with almost 6 million pending cases nationwide. 
