Automatic segmentation of histopathological images is an important task in computer-aided pathology analysis. However, segmenting and analyzing digitized histopathology images is challenging due to the large size of whole slide images (WSIs) and the diversity and complexity of their features. In this paper, we propose a multi-resolution attention and multi-scale convolution network (MAMC-Net) for automatic tumor segmentation of WSIs. First, MAMC-Net introduces a multi-resolution attention module that takes multi-resolution images as pyramid inputs to capture a wider range of feature information and richer details. Specifically, we employ an attention mechanism at each pyramid level to capture discriminative features relevant to the segmentation task. Furthermore, a multi-scale convolution module is designed to produce multi-scale feature representations by aggregating the intact semantic information from the deepest layer of the encoder with the high-resolution details from the final layer of the decoder. To further refine the segmentation results, we adopt a fully connected Conditional Random Field (CRF) to stitch the overlapping prediction maps, avoiding discontinuities and inconsistencies along cancer boundaries. Finally, we demonstrate the effectiveness of our framework on open-source datasets, including the CAMELYON17 (breast cancer metastases) and BOT (gastric cancer) datasets. The experimental results show that the proposed MAMC-Net outperforms other state-of-the-art methods, achieving a Dice coefficient (DSC) of 0.929, an IoU of 0.867, and a recall of 0.933 on the breast cancer dataset, and a DSC of 0.890, an IoU of 0.802, and a recall of 0.903 on the gastric cancer dataset.
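
To make the two core ideas concrete, the following is a minimal PyTorch sketch, written under our own assumptions rather than taken from the paper's implementation: the module names (MultiResolutionAttention, MultiScaleConv), channel widths, number of pyramid levels, and the squeeze-and-excitation-style channel gate are hypothetical choices used only to show how pyramid inputs with per-level attention and multi-kernel feature fusion can be wired together.

```python
# Illustrative sketch only (not the authors' code): downsampled pyramid inputs,
# each gated by a channel-attention block, fused with parallel convolutions
# of different kernel sizes to mix coarse context and fine detail.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ChannelAttention(nn.Module):
    """Squeeze-and-excitation-style gate, used here as a stand-in for the
    attention applied at each pyramid level (an assumption, not the paper's exact design)."""
    def __init__(self, channels, reduction=4):
        super().__init__()
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        return x * self.gate(x)

class MultiResolutionAttention(nn.Module):
    """Feeds the patch at several resolutions, attends to each level,
    then upsamples all features back to the input size and concatenates them."""
    def __init__(self, in_ch=3, feat_ch=32, levels=3):
        super().__init__()
        self.blocks = nn.ModuleList([
            nn.Sequential(
                nn.Conv2d(in_ch, feat_ch, 3, padding=1),
                nn.BatchNorm2d(feat_ch),
                nn.ReLU(inplace=True),
                ChannelAttention(feat_ch),
            )
            for _ in range(levels)
        ])

    def forward(self, x):
        feats = []
        for i, block in enumerate(self.blocks):
            # level 0 uses the full-resolution patch; deeper levels are downsampled by 2**i
            scaled = x if i == 0 else F.interpolate(
                x, scale_factor=1 / 2 ** i, mode="bilinear", align_corners=False)
            f = block(scaled)
            feats.append(F.interpolate(
                f, size=x.shape[-2:], mode="bilinear", align_corners=False))
        return torch.cat(feats, dim=1)

class MultiScaleConv(nn.Module):
    """Parallel 1x1/3x3/5x5 convolutions whose outputs are concatenated and projected,
    a simple way to aggregate features at several receptive fields."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.branches = nn.ModuleList([
            nn.Conv2d(in_ch, out_ch, k, padding=k // 2) for k in (1, 3, 5)
        ])
        self.project = nn.Conv2d(3 * out_ch, out_ch, 1)

    def forward(self, x):
        return self.project(torch.cat([b(x) for b in self.branches], dim=1))

if __name__ == "__main__":
    x = torch.randn(1, 3, 256, 256)            # one 256x256 RGB patch cropped from a WSI
    feats = MultiResolutionAttention()(x)       # (1, 96, 256, 256)
    logits = MultiScaleConv(96, 2)(feats)       # (1, 2, 256, 256): tumor / background scores
    print(logits.shape)
```

In a full pipeline, patch-level predictions of this kind would still need to be stitched across overlapping tiles (for example with the fully connected CRF mentioned above) before evaluating slide-level boundaries.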