People working within the criminal justice system readily perceive that judges are busy people, whose typical workday leaves little room for built-in leisure.
Given their hectic schedules and sobering responsibilities, many judges willingly embrace tools that enhance productivity. They are especially inclined to do so when, as noted in a recent national article on criminal justice, such tools help them “improve on the accuracy of human decision-making that allows for a better allocation of resources.”
Increasingly, we hear about one such tool, regarded as highly useful by law enforcers and prosecutors across the country: the algorithm. Essentially, an algorithm is a computer-based set of steps and calculations programmed to solve a problem.
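To make that definition concrete, here is a minimal sketch of what "a set of steps and calculations" looks like in practice. Everything in it is invented for illustration only; the factors and point values do not reflect any real risk-assessment tool used by courts.

```python
# A toy algorithm: a fixed sequence of steps that turns inputs into an answer.
# The inputs, weights, and name are hypothetical, chosen purely to illustrate
# the idea of programmed steps and calculations.

def simple_risk_score(prior_offenses: int, age: int) -> int:
    """Hypothetical example: compute a score from two inputs."""
    score = 0
    # Step 1: each prior offense adds two points.
    score += 2 * prior_offenses
    # Step 2: defendants under 25 receive one additional point.
    if age < 25:
        score += 1
    return score

print(simple_risk_score(prior_offenses=3, age=22))  # prints 7
```

An algorithm this simple is fully transparent: anyone can read the two steps and see exactly why a given score was produced. The concern raised below arises when the steps are no longer written out by a human at all.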
If that sounds complex, it is. In fact, some algorithms are so dense and complicated that explaining them in even rudimentary fashion can seem an insuperable task. A recent article in the technology-focused publication Wired refers to “opaque neural networks” that are centrally defined by their lack of transparency and an almost chameleon-like nature, one that defies comprehension for their users and in many instances makes meaningful oversight a flat impossibility.
And the primary reason for that is this: With each successive tweaking, select algorithms become ever more like the human brain. A neural network, notes Wired, “creates connections on its own.”
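The phrase "creates connections on its own" can be illustrated with a toy sketch. In the example below, the connection strength (the weight) is never typed in by a programmer; the program adjusts it automatically from example data. This is a deliberately tiny, hypothetical illustration of the learning idea, not a depiction of any actual court-used system, which would involve millions of such weights.

```python
# A toy sketch of a program forming a "connection" on its own:
# the weight w starts at zero and is adjusted automatically from data
# (a one-weight version of gradient descent). No human sets its value.

def train_single_weight(examples, steps=100, lr=0.1):
    """Fit y = w * x by repeated small corrections."""
    w = 0.0  # the connection strength starts at nothing
    for _ in range(steps):
        for x, y in examples:
            error = w * x - y
            w -= lr * error * x  # the program corrects w itself
    return w

# From the data pairs alone, the program settles on w close to 2;
# nobody wrote the number 2 into the code.
w = train_single_weight([(1, 2), (2, 4), (3, 6)])
print(round(w, 2))  # prints 2.0
```

Multiply that single self-adjusted weight by millions, layered and interacting, and the result is a system whose internal reasoning even its own builders may struggle to explain.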
That is starkly worrisome in the criminal law realm, states Wired, because — as noted above — courts are increasingly relying upon algorithm-generated data to “inform [their] decisions about bail, sentencing and parole.” Such information materially affects defendants facing a wide range of drug-related charges, sex offenses and other matters.
There is an ethical and legal conundrum intimately linked with that, which we will focus upon in a future post.