Support Vector Regression (SVR) and its extensions have proven effective for regression problems, yet they face challenges, including high computational cost on large-scale datasets and sensitivity to outliers. To mitigate these limitations, variants such as twin SVR (TWSVR) and robust TWSVR (RTWSVR) have been proposed. However, existing approaches may still fail to keep the final regressor aligned with the dataset during the learning process. In this paper, we introduce an extended twin parametric margin SVR (ETPMSVR) model inspired by the principles of the robust geometric TPMSVM (RGTPSVM). The ETPMSVR addresses these challenges by integrating the average of the $\varepsilon$-insensitive upper and lower bounds of the regressor into the objective function and constraints, which ensures alignment with the dataset, so that the final regressor is obtained together with the two boundary hyperplanes by solving a single quadratic programming problem during training. Additionally, a hinge-loss function is incorporated to enhance robustness against outliers. We derive the dual formulation to improve computational efficiency. Experiments on a diverse range of datasets, comprising 10 UCI datasets and 8 S&P index datasets from financial markets, demonstrate the efficacy of the proposed model in comparison with several benchmarks, including TWSVR, RTWSVR, the multi-layer perceptron, and the long short-term memory network.
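For context, a minimal sketch of the classical $\varepsilon$-insensitive SVR primal, the baseline from which twin-type variants such as TWSVR and the proposed ETPMSVR depart, is given below; the notation ($w$, $b$, regularization parameter $C$, slack variables $\xi_i$, $\xi_i^*$) is the standard one and is not taken from the paper itself.
$$
\begin{aligned}
\min_{w,\, b,\, \xi,\, \xi^*} \quad & \frac{1}{2}\lVert w \rVert^2 + C \sum_{i=1}^{n} \left( \xi_i + \xi_i^* \right) \\
\text{s.t.} \quad & y_i - w^{\top} x_i - b \le \varepsilon + \xi_i, \\
& w^{\top} x_i + b - y_i \le \varepsilon + \xi_i^*, \\
& \xi_i \ge 0, \; \xi_i^* \ge 0, \quad i = 1, \dots, n.
\end{aligned}
$$
Whereas this baseline fits a single regressor in one quadratic program, twin-type formulations instead construct $\varepsilon$-insensitive upper and lower bound functions; as summarized above, the proposed ETPMSVR recovers the final regressor together with both boundary hyperplanes from a single quadratic programming problem.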