A Bayesian approach to generalised partial linear regression models is proposed, in which the regression functions are modelled nonparametrically using regression splines subject to shape and smoothness assumptions. The knots may be treated as either fixed or free, with a reversible-jump Markov chain Monte Carlo algorithm employed in the latter case. The modelling framework, together with vague prior distributions, provides greater flexibility than other Bayesian constrained smoothers; moreover, the method is simpler, more intuitive, easier to implement, and computationally faster. Inference concerning parametrically modelled covariates can be carried out using approximate marginal distributions, while standard Bayes model selection methods allow more general inference. Simulations show that the inference methods have desirable Bayesian and frequentist properties. In particular, they often perform comparably to standard parametric methods when the parametric assumptions are met and are superior when those assumptions are violated. R code implementing the methods described here is available at www.stat.colostate.edu/~meyer/code.htm.