Accelerator physics presents unique challenges for natural language processing (NLP) due to its specialized terminology and complex concepts. A key component in overcoming these challenges is the development of robust text embedding models that transform textual data into dense vector representations, facilitating efficient information retrieval and semantic understanding. In this work, we introduce AccPhysBERT, a sentence embedding model fine-tuned specifically for accelerator physics. Our model demonstrates superior performance across a range of downstream NLP tasks, surpassing existing models in capturing the domain-specific nuances of the field. We further showcase its practical applications, including semantic paper-reviewer matching and integration into retrieval-augmented generation systems, highlighting its potential to enhance information retrieval and knowledge discovery in accelerator physics.
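To make the paper-reviewer matching application concrete, the sketch below shows how dense sentence embeddings support semantic matching: each paper and each reviewer profile is mapped to a unit-normalized vector, and cosine similarity (here, a dot product of normalized vectors) ranks reviewers per paper. The `embed` function is a purely illustrative stand-in (a character-trigram hash), not AccPhysBERT itself; in practice one would substitute the fine-tuned model's encoder.

```python
import numpy as np

def embed(text, dim=64):
    # Stand-in for a sentence embedding model such as AccPhysBERT:
    # hashes character trigrams into a fixed-size dense vector.
    vec = np.zeros(dim)
    for i in range(len(text) - 2):
        vec[hash(text[i:i + 3]) % dim] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

def match_reviewers(papers, reviewers):
    # Stack embeddings into matrices; since every vector is
    # unit-normalized, the matrix product gives cosine similarities.
    P = np.stack([embed(p) for p in papers])
    R = np.stack([embed(r) for r in reviewers])
    sim = P @ R.T                 # shape: (num_papers, num_reviewers)
    return sim.argmax(axis=1)     # best-matching reviewer per paper

papers = ["rf cavity design for superconducting linacs"]
reviewers = ["beam dynamics in storage rings",
             "rf cavity design for superconducting linacs"]
print(match_reviewers(papers, reviewers))  # index of best reviewer
```

A real system would replace `embed` with the fine-tuned model and typically solve a one-to-one assignment (e.g. via the Hungarian algorithm) rather than an independent argmax, but the embedding-plus-cosine-similarity core is the same.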