Welded I-section steel exposed to harsh environments is highly susceptible to corrosion, with local corrosion being particularly common and detrimental. Local corrosion causes uneven reductions in plate thickness, which intensify stress concentration, trigger premature buckling in the corroded regions and thereby compromise structural stability. However, the influence of local corrosion on buckling behaviour has not been clarified in prior research. This paper addresses this gap by designing and testing 12 welded I-section steel columns with simulated local corrosion (varying slenderness ratios, corrosion locations, corrosion depths and corrosion dimensions) under axial compression. Electrochemically accelerated corrosion tests were conducted to simulate local corrosion, and the resulting corrosion morphology was captured using three-dimensional scanning technology. Axial compression tests were then performed to examine the influence of local corrosion on the failure modes, load-displacement curves and cross-sectional strain development of the corroded steel columns. Additionally, finite element (FE) models incorporating the scanned corrosion morphology were established, validated against the test results and used for parametric analysis. The key parameters included geometric dimensions (slenderness ratio, width-to-thickness ratio and thickness ratio) and corrosion region characteristics (location, depth and dimension). Both the experimental and numerical results revealed that local corrosion altered the buckling modes and reduced the residual resistance of the welded I-section steel columns under axial compression. Finally, an equivalent thickness and an initial eccentricity coefficient were defined and incorporated into existing codified approaches to predict the residual resistance of welded I-section steel columns with local corrosion under axial compression.