Although ChatGPT promises wide-ranging applications, concerns have been raised that it is politically biased, specifically toward a left-libertarian orientation. In light of recent efforts to reduce such biases, this study re-evaluated the political bias of ChatGPT using political orientation tests administered through the application programming interface (API). The effects of the language used in the system, as well as of gender and race settings, were also evaluated. The results indicated that ChatGPT exhibited less political bias than previously reported; however, they did not entirely rule out such bias. The language used in the system and the gender and race settings may induce political biases. These findings deepen our understanding of ChatGPT's political biases and may inform bias evaluation and the design of ChatGPT's operational strategies.