Despite remarkable recent advances, current zero-shot sketch-based image retrieval (ZS-SBIR) models are mainly challenged by two issues. First, they must bridge the enormous domain gap between sketches and images, as well as significant intra-class variations. Second, the scarcity of sketch samples can degrade the feature representations learned from the training data, limiting the model's ability to generalize to unseen categories or instances. To address these issues, we propose a novel ZS-SBIR model. On the one hand, a hybrid information fusion module is introduced to combine both the structural information (e.g., shapes and contours) and the sequential information (e.g., stroke order) of sketches and images into a comprehensive representation. This module also captures multi-level feature representations of sketches, so that the model adapts better to different types of sketches. On the other hand, both intra- and inter-sample correlations between sketches and images are modeled via a dual-sample relationship modeling module. Specifically, it takes full advantage of each sample in data-scarce scenarios by extracting complementary semantic information among the images in a mini-batch, which allows the model to learn more robust feature representations and improves the discriminability of unseen-class features. Extensive experiments on three public benchmark datasets demonstrate the superiority of our method over state-of-the-art ZS-SBIR models.
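The abstract does not give implementation details, so the following is a minimal PyTorch sketch of how the two described modules could plausibly be realized. All class names, dimensions, and design choices here are assumptions for illustration, not the authors' actual architecture: the hybrid fusion is sketched as a GRU over stroke sequences concatenated with a precomputed structural feature, and the dual-sample relationship is sketched as pairwise fusion (intra-sample) plus multi-head attention across the mini-batch (inter-sample).

```python
import torch
import torch.nn as nn

class HybridFusion(nn.Module):
    """Hypothetical hybrid information fusion: combines a structural feature
    (e.g., from a CNN over the rendered sketch) with a sequential feature
    (e.g., a GRU over the stroke sequence). Illustrative only."""
    def __init__(self, dim: int, stroke_dim: int = 5):
        super().__init__()
        self.seq_enc = nn.GRU(stroke_dim, dim, batch_first=True)
        self.fuse = nn.Sequential(nn.Linear(2 * dim, dim), nn.ReLU())

    def forward(self, struct_feat: torch.Tensor, strokes: torch.Tensor) -> torch.Tensor:
        # struct_feat: (B, D) structural features; strokes: (B, T, stroke_dim) stroke points
        _, h = self.seq_enc(strokes)            # h: (1, B, D), last hidden state of the GRU
        return self.fuse(torch.cat([struct_feat, h.squeeze(0)], dim=-1))

class DualSampleRelation(nn.Module):
    """Hypothetical dual-sample relationship module.

    Intra-sample: fuse each sketch with its paired image feature.
    Inter-sample: let every sample attend to all others in the mini-batch
    to borrow complementary semantics in data-scarce settings."""
    def __init__(self, dim: int, heads: int = 4):
        super().__init__()
        self.intra = nn.Sequential(nn.Linear(2 * dim, dim), nn.ReLU())
        self.inter = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)

    def forward(self, sketch: torch.Tensor, image: torch.Tensor) -> torch.Tensor:
        # sketch, image: (B, D) mini-batch features from the two domains
        fused = self.intra(torch.cat([sketch, image], dim=-1))  # intra-sample pair fusion
        tokens = fused.unsqueeze(0)                             # (1, B, D): treat the batch as a sequence
        mixed, _ = self.inter(tokens, tokens, tokens)           # inter-sample attention across the batch
        return self.norm(fused + mixed.squeeze(0))              # residual connection per sample

if __name__ == "__main__":
    b, t, d = 8, 32, 256
    fusion = HybridFusion(d)
    relation = DualSampleRelation(d)
    sketch_feat = fusion(torch.randn(b, d), torch.randn(b, t, 5))
    out = relation(sketch_feat, torch.randn(b, d))
    print(out.shape)  # torch.Size([8, 256])
```

Treating the mini-batch itself as an attention sequence is one simple way to extract complementary semantics across samples, as the abstract describes; the actual model may use a different relation mechanism.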