
Senior Software Engineer (Java/Python/LLM)
Binance
Posted 2/24/2025

Job Summary
This Senior Software Engineer role at Binance involves managing data processing pipelines, optimizing large language model (LLM) prompts, and collaborating with cross-functional teams. The engineer will scale AI-driven Customer Service chat analysis, ensure timely, high-quality data inputs to AI models, and drive bot adoption. Responsibilities include designing LLM prompts, conducting A/B testing, developing back-end services, and troubleshooting performance bottlenecks. The ideal candidate has 5+ years of experience in Java data processing and pipeline development, proficiency in Java or Python, and hands-on experience with AWS, GCP, or Azure. Binance offers a competitive salary, company benefits, and a work-from-home arrangement.
Job Description
Responsibilities:
- Work with the data warehouse team to maintain the real-time data pipeline of the CS business domain, ensuring timely and high-quality data inputs to the AI models.
- Design, implement, and refine LLM prompts for enhanced attribution analysis of CS conversations.
- Work closely with Ops teams to increase the stability of LLM solutions in a cloud or on-prem environment.
- Conduct A/B testing and iterative improvements on LLM performance, and maintain result accuracy.
- Develop and maintain back-end services and automation scripts using Java or Python, ensuring seamless integration with existing systems.
- Troubleshoot performance bottlenecks, optimize code, and safeguard data security and privacy.
- Collaborate with CS, BI, and SL teams to gather requirements, clarify feature demands, and communicate technical constraints.
- Maintain clear, concise documentation of data workflows, LLM prompt templates, and best practices.
Qualifications:
- Bachelor’s or Master’s degree in Computer Science, Data Science, or a related field.
- 5+ years of hands-on experience in Java data processing and pipeline development.
- Experience with SQL/NoSQL databases, ETL pipelines, and data workflow orchestration tools (e.g., Airflow).
- Proficient in at least one of Java or Python, as well as Hive and Spark SQL.
- Hands-on experience with AWS, GCP, or Azure is a plus.
- Experience with LLM or NLP projects is highly desired.
- Excellent communication and teamwork skills, with the ability to work in a cross-functional environment.
- Fluency in English is required to coordinate with overseas partners and stakeholders. Additional languages would be an advantage.