
Binance Accelerator Program - Data Warehouse Engineer
Binance
Posted 2/10/2025

Job Summary
The Binance Accelerator Program is a 3-6 month program for early-career talent to experience life at Binance and gain an immersive understanding of the company's operations. As a Data Warehouse Engineer, you will be responsible for building a universal, flexible data warehouse system that can quickly support business needs. You will work with a talented team, participate in technical learning and growth, and contribute to the team's overall knowledge accumulation and skill improvement. The program offers a competitive salary, company benefits, and a work-from-home arrangement. Binance is committed to being an equal opportunity employer and values diversity in its workforce.
Job Description
Requirements
- Undergraduate in a quantitative discipline, such as Mathematics/Statistics, Actuarial Sciences, Computer Science, Engineering, or Life Sciences.
- Understanding of data warehouse modeling and data governance, and knowledge of data warehouse development methodologies, including dimensional modeling and the corporate information factory.
- Proficient in at least one of Java, Scala, or Python, as well as Hive and Spark SQL.
- Familiar with OLAP technologies such as Kylin, Impala, Presto, and Druid.
- Knowledgeable in Big Data batch pipeline development.
- Familiar with Big Data components including but not limited to Hadoop, Hive, Spark, Delta Lake, Hudi, Presto, HBase, Kafka, ZooKeeper, Airflow, Elasticsearch, and Redis.
- Experience with AWS Big Data services is a plus.
- Strong collaborative attitude, with the ability to build partnerships with other teams and business units.
- Experience in real-time data processing and familiarity with stream processing frameworks such as Apache Kafka and Apache Flink; in-depth knowledge of Lakehouse technology with practical project experience; proficiency in StarRocks, including its data model design, query optimization, and performance tuning.
- Experience in knowledge graph construction and application, and knowledge of graph databases such as Nebula.
Responsibilities
- Build a universal, flexible data warehouse system, guided by the company's data warehouse specifications and an understanding of the business, that can quickly support business needs and reduce repetitive development work.
- Own data model design, development, testing, deployment, and online data job monitoring; quickly solve complex problems, especially the optimization of complex calculation logic and performance tuning.
- Participate in data governance, including the construction of the company's metadata management system and data quality monitoring system.
- Design and implement an integrated lakehouse data platform to support real-time data processing and analysis requirements.
- Build knowledge graphs and provide in-depth business insights.
- Participate in technical team building and learning, and contribute to the team's overall knowledge accumulation and skill improvement.