Drolet Bail Bonds Charleston SC | Bail Bonds Company Charleston South Carolina

Algorithm Concerns

There are many positives that have come from the advancement of technology in criminal justice. However, along with those positives there are some areas that need to be addressed. As a bail bondsman, I have seen some of this up close. Recent technological innovations have produced the latest generation of offender risk classification tools (Byrne and Marx). These tools are used in several ways. For example, data is entered into a computer program (an algorithm), and the program then scores the individual. This system can be applied in many different areas. Risk assessment tools are used to manage approximately 7.5 million offenders in the United States (Byrne and Marx). Offenders are placed into different categories when they leave prison or are placed on parole. Higher-risk individuals may be placed in programs with more supervision than low-risk offenders. For example, if the algorithm judges an individual to be a bigger risk, it may recommend that the offender wear an ankle monitor; if it rates an offender less likely to re-offend, that may mean less supervision and no GPS ankle monitor. Should the system allow computer algorithms to decide whether an offender is likely to reoffend? Should the algorithm be used at the time of arrest to determine bond? Or should this be left up to judges and probation officers? This new algorithmic system is the issue that matters most to me in my part of the criminal justice field.
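To make the scoring idea concrete, here is a minimal sketch of how a risk tool of this kind might turn a few inputs into a score and a supervision level. The factors, weights, and cutoffs below are invented for illustration only; real tools are proprietary and far more complex.

```python
# Hypothetical risk-classification sketch. All weights and thresholds
# are made up for illustration -- not taken from any real tool.

def risk_score(prior_arrests, prior_ftas, age_at_first_arrest, employed):
    score = 0
    score += 2 * prior_arrests        # each prior arrest adds weight
    score += 3 * prior_ftas           # failures to appear weigh heavily
    if age_at_first_arrest < 21:      # early contact with the system
        score += 2
    if not employed:                  # stability factors matter too
        score += 1
    return score

def supervision_level(score):
    # Higher score -> more supervision (e.g., a GPS ankle monitor).
    if score >= 8:
        return "high: intensive supervision, GPS monitor"
    elif score >= 4:
        return "medium: regular check-ins"
    return "low: minimal supervision"

score = risk_score(prior_arrests=3, prior_ftas=1,
                   age_at_first_arrest=19, employed=False)
print(score, "->", supervision_level(score))
```

The point of the sketch is simply that the number coming out of the program, not a person, is what maps an offender into a supervision category.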

Now let us look at what used to happen when an offender was arrested. The offender, or defendant, would be booked into the county jail and wait for a bond hearing. At the hearing the judge would consider several things about the individual: prior arrests, for one, and whether the offender's record showed any prior FTAs (failures to appear), meaning the offender had skipped out on court in the past. This system seemed to work well. But with technological advancements in the criminal justice field, some argued that it needed to change. One argument was that minorities were held in jail on a bond while other defendants were released on their own recognizance. As a result, many jurisdictions now use algorithms to decide who gets released on their own recognizance and who must pay a bond. This is a very slippery slope, to say the least. New Jersey adopted algorithmic risk assessment in 2014, based partly on the efforts of a nonprofit group called the Pretrial Justice Institute (Simonite). This organization wants to see cash bail ended and computer algorithms used in its place (Simonite). But the group that pushed the algorithm system has since changed its tune. Asked what changed the organization's position, Tenille Patterson, an executive partner at the Pretrial Justice Institute, said: "We saw in jurisdictions that use the tools and saw jail populations decrease that they were not able to see disparities decrease, and in some cases saw disparities increase" (Simonite). In other words, her opinion changed because under the new algorithmic system minorities were still sitting in jail more than others, and the algorithm did not work the way she thought it would.
In my own bail bond business, my home county (Charleston, SC) has now implemented algorithms as well. Once offenders are arrested, they are booked into jail and given the option of an attorney to represent them. They are also screened and asked various questions. The screener enters the information into the computer program and waits for it to issue a number: a higher number means more risk, a lower number less risk. The judge uses that number in deciding whether to set bail or release the individual on his or her own recognizance. In my experience in the industry, the algorithm they use has many problems. For example, I had a defendant out on bond for numerous charges; I had bonded this person out of jail many times over the years. The last time I had him out on bond, he skipped his court date, and I had to locate him and place him back into custody. While in jail, he went back in front of the judge to get a bond set for a new charge he had picked up while out on my bond. The screener entered his information into the computer, and the algorithm decided he was low risk to re-offend or fail to appear in court. Unbelievable! How could this happen when I had just taken him back to jail with an active bench warrant for failure to appear, along with a new charge, and the system still said he was low risk? The answer is simple. The screener asks the defendant the questions and enters the answers into the algorithm, without accounting for the fact that some people lie. That is part of the problem with the algorithm in the pretrial phase: some individuals lie to make themselves look better and to receive a better score. Some offenders, because they go to jail quite often, know about the program and try to manipulate the system to their advantage.
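The self-report problem described above can be shown with a tiny sketch: if the screener scores the defendant's own answers rather than the court record, a lie directly lowers the score. The scoring function and its weights here are invented for illustration, not taken from any real screening tool.

```python
# Hypothetical pretrial score based on two factors. Weights are made up.
def pretrial_score(prior_ftas, open_charges):
    return 3 * prior_ftas + 2 * open_charges

# What the defendant tells the screener (denying everything):
self_reported = pretrial_score(prior_ftas=0, open_charges=0)

# What the court record actually shows (an active bench warrant for
# failure to appear, plus a new charge picked up while out on bond):
actual_record = pretrial_score(prior_ftas=1, open_charges=1)

print("self-reported score:", self_reported)   # low risk on paper
print("actual-record score:", actual_record)   # higher risk in reality
```

Garbage in, garbage out: the same formula produces a "low risk" result or a "higher risk" result depending entirely on whether the inputs are verified against the record.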

After reading this, one may assume that I am totally against algorithms being used at all. This is not the case. I would propose that those in charge keep using an algorithm on individuals coming out of prison (post-release). At the point of release, the judicial system as a whole already has a wealth of information on each offender: infractions received while in prison, for example, or college credits earned there. With that said, I would use the algorithm as a source of information, a strategy that can help those in charge make good decisions when assigning risk classes to offenders about to be released back into society. I am specifically talking about the ones sent home early on parole or probation; they are the ones who need supervision in the first place. I would certainly not favor placing an individual in a high-risk or low-risk category based solely on what the algorithm thinks. It should be treated as additional information, but the human element must still be there. I believe an algorithm can be helpful and successful only if it is not the sole factor in determining overall risk. Probation officers and judges should make the final decision with the information placed before them. Used in conjunction with experienced members of the criminal justice field, the algorithm could certainly help both the offender and the system, and used properly it should save the judicial system money in the long run.


As for using algorithms at bond hearings, I would propose that they not be used to set bond. Some academic researchers have argued that pretrial risk tools should be abandoned because they are often built from data that reflects ethnic and racial disparities (Simonite). I believe the judge should look at each person individually during the bond hearing: his or her record, time in the community, potential flight risk based on prior arrests, and finally the crime itself. Victimless crimes should be looked at differently from crimes with victims, especially violent crimes. I do not think a victim would, or even should, feel comfortable when the person accused of the crime against them can walk out of jail based on what an algorithm thinks.




Byrne, James, and Gary Marx. "Technological Innovations in Crime and Policing: A Review of the Research on Implementation and Impact." 2011, pp. 17-40.

Simonite, Tom. "Algorithms Were Supposed to Fix the Bail System. They Haven't." Wired, Condé Nast, www.wired.com/story/algorithims-supposed-fix-bail-system-they-haven’t/.
