Orthogonal frequency division multiplexing (OFDM), one of the dominant technologies for fifth-generation (5G) wireless communication systems, offers a high data rate with good spectral efficiency. However, OFDM-based systems suffer from various channel imperfections: channel distortion, carrier frequency offset (CFO) due to transceiver local-oscillator frequency mismatch, sampling frequency offset (SFO) between the DAC and ADC, nonlinear distortion (NLD) introduced by the nonlinear power amplifier, and fading caused by frequency-selective channels. Recently, the OFDM-based non-orthogonal multiple access (NOMA) scheme has been attracting researchers' attention for 5G and beyond owing to its high spectral efficiency, massive connectivity, and resilience to frequency selectivity. However, NOMA-OFDM systems inherit the same impairments as OFDM-based systems, which complicates deploying such multicarrier systems for 5G-and-beyond applications. This paper studies the effect of these channel imperfections on OFDM and NOMA-OFDM systems and proposes machine learning (ML)-based estimation and signal detection for both systems in the presence of the aforementioned imperfections. ML models including the neural network (NN), recurrent NN, and long short-term memory (LSTM) network are adopted for channel estimation and equalization, CFO/SFO estimation and compensation, and NLD mitigation in the OFDM and NOMA-OFDM receivers. Extensive simulation studies show that the ML-based techniques outperform traditional least-squares (LS) and minimum mean-square-error (MMSE) methods for the OFDM receiver, and LS-SIC and MMSE-SIC (successive interference cancellation) methods for the NOMA-OFDM receiver.
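For context, the classical LS baseline that the ML techniques are compared against can be sketched as follows. This is a minimal NumPy illustration with assumed parameters (64 subcarriers, a 4-tap channel, QPSK pilots on every subcarrier, 30 dB SNR), not code from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 64  # number of subcarriers (illustrative choice)

# Frequency-selective channel: random 4-tap impulse response, unit average power
h = (rng.standard_normal(4) + 1j * rng.standard_normal(4)) / np.sqrt(8)
H = np.fft.fft(h, N)  # true channel frequency response

# Known QPSK pilot symbols on all subcarriers (full-pilot training symbol)
X = (2 * rng.integers(0, 2, N) - 1 + 1j * (2 * rng.integers(0, 2, N) - 1)) / np.sqrt(2)

# Received frequency-domain symbols: Y = H * X + noise, at 30 dB SNR
snr_db = 30
noise_var = 10 ** (-snr_db / 10)
W = np.sqrt(noise_var / 2) * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
Y = H * X + W

# Least-squares channel estimate: per-subcarrier division by the known pilot
H_ls = Y / X
mse = np.mean(np.abs(H_ls - H) ** 2)
```

At 30 dB SNR the LS estimate's mean-square error is close to the noise variance; the ML-based estimators proposed in the paper aim to improve on this, particularly under CFO, SFO, and NLD, where the simple `Y/X` model no longer holds.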