Roles & Responsibilities
1. Responsible for work related to large model fine-tuning, including but not limited to data sample preparation and efficient training
2. Explore and study how to efficiently construct SFT training samples from the original text of classic IP novels to improve the model's dialogue ability
3. Improve the large model's ability to follow complex prompts and fully exploit its potential
4. Improve the large model's ability to work with retrieval; explore efficient methods for embedding knowledge into the model and for online learning and updating of model knowledge
Qualifications
1. Master's degree or above in computer science or a related field
2. Proficient in Python, familiar with development in a Linux environment, and proficient with deep learning frameworks such as TensorFlow or PyTorch
3. Familiar with cutting-edge deep learning and NLP algorithms, familiar with model architectures such as Transformer and GPT, and knowledgeable about techniques related to large model supervised fine-tuning (SFT)
4. Strong sense of responsibility and teamwork, a spirit of technical inquiry, and a positive attitude; able to integrate actively into the team
5. Preference will be given to candidates with hands-on experience in large model fine-tuning or high-quality publications at top conferences in the field of deep learning