
Principal Engineer - Data Platform
Level AI
- Location: India
Lead the design and development of Level AI's data warehouse and analytics platform, collaborate on scalable systems, mentor junior engineers, and drive engineering excellence.
Level AI was founded in 2019 and is a Series C startup headquartered in Mountain View, California. Level AI revolutionizes customer engagement by transforming contact centers into strategic assets. Our AI-native platform leverages advanced technologies such as Large Language Models to extract deep insights from customer interactions. By providing actionable intelligence, Level AI empowers organizations to enhance customer experience and drive growth. Consistently updated with the latest AI innovations, Level AI stands as the most adaptive and forward-thinking solution in the industry.
Position Overview: We seek an experienced Principal Software Engineer to lead the design and development of our data warehouse and analytics platform, and to help raise the engineering bar for the entire technology stack at Level AI, including applications, platform, and infrastructure.
You will collaborate actively with team members and the wider Level AI engineering community to develop highly scalable, performant systems. As a technical thought leader, you will help solve complex problems of today and the future by designing and building simple, elegant technical solutions. You will coach and mentor junior engineers, drive engineering best practices, and work closely with product managers and other stakeholders both inside and outside the team.
Competencies:
Data Modeling: Skilled in designing data warehouse schemas (e.g., star and snowflake schemas), with experience in fact and dimension tables, as well as normalization and denormalization techniques.
Data Warehousing & Storage Solutions: Proficient with platforms such as Snowflake, Amazon Redshift, Google BigQuery, and Azure Synapse Analytics.
ETL/ELT Processes: Expertise in ETL/ELT tools (e.g., Apache NiFi, Apache Airflow, Informatica, Talend, dbt) to facilitate data movement from source systems to the data warehouse.
SQL Proficiency: Advanced SQL skills for complex queries, indexing, and performance tuning.
Programming Skills: Strong in Python or Java for building custom data pipelines and handling advanced data transformations.
Data Integration: Experience with real-time data integration tools like Apache Kafka, Apache Spark, AWS Glue, Fivetran, and Stitch.
Data Pipeline Management: Familiar with workflow automation tools (e.g., Apache Airflow, Luigi) to orchestrate and monitor data pipelines.
APIs and Data Feeds: Knowledgeable in API-based integrations, especially for aggregating data from distributed sources.
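As a minimal illustration of the star-schema modeling called out above, the sketch below builds one hypothetical fact table joined to two dimension tables with an in-memory SQLite database (all table and column names are invented for this example, not Level AI's actual schema):

```python
import sqlite3

# Minimal star-schema sketch: one fact table with foreign keys into two
# dimension tables. All names here are hypothetical, for illustration only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_customer (customer_id INTEGER PRIMARY KEY, region TEXT);
    CREATE TABLE dim_date     (date_id     INTEGER PRIMARY KEY, month  TEXT);
    CREATE TABLE fact_sales (
        sale_id     INTEGER PRIMARY KEY,
        customer_id INTEGER REFERENCES dim_customer(customer_id),
        date_id     INTEGER REFERENCES dim_date(date_id),
        amount      REAL
    );
    INSERT INTO dim_customer VALUES (1, 'APAC'), (2, 'EMEA');
    INSERT INTO dim_date     VALUES (10, '2024-01'), (11, '2024-02');
    INSERT INTO fact_sales   VALUES (100, 1, 10, 250.0),
                                    (101, 2, 10, 100.0),
                                    (102, 1, 11, 50.0);
""")

# A typical analytical query: join the fact table to its dimensions
# and aggregate revenue by region and month.
rows = conn.execute("""
    SELECT c.region, d.month, SUM(f.amount) AS revenue
    FROM fact_sales f
    JOIN dim_customer c ON c.customer_id = f.customer_id
    JOIN dim_date     d ON d.date_id     = f.date_id
    GROUP BY c.region, d.month
    ORDER BY c.region, d.month
""").fetchall()
print(rows)
# → [('APAC', '2024-01', 250.0), ('APAC', '2024-02', 50.0), ('EMEA', '2024-01', 100.0)]
```

Keeping measures in a narrow fact table and descriptive attributes in dimensions is what makes such group-by queries both simple to write and cheap to run at warehouse scale.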
Responsibilities:
Design and implement analytical platforms that provide insightful dashboards to customers.
Develop and maintain data warehouse schemas, such as star schemas, fact tables, and dimensions, to support efficient querying and data access.
Oversee data propagation processes from source databases to warehouse-specific databases/tools, ensuring data accuracy, reliability, and timeliness.
Ensure the architectural design is extensible and scalable to adapt to future needs.
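The data-propagation responsibility above can be sketched as a small transform-and-validate step; this is a hedged illustration with invented record and field names, not a description of Level AI's actual pipeline:

```python
# Hypothetical propagation step: extract rows from a source system,
# transform them into the warehouse's shape, and validate that nothing
# was dropped or corrupted before loading. Field names are illustrative.

def transform(row):
    # Normalize a raw source record into the warehouse schema.
    return {
        "customer_id": int(row["cust"]),
        "amount": round(float(row["amt"]), 2),
    }

def propagate(source_rows):
    loaded = [transform(r) for r in source_rows]
    # Reliability checks: row counts and totals must match the source.
    assert len(loaded) == len(source_rows), "row count mismatch"
    src_total = sum(float(r["amt"]) for r in source_rows)
    dst_total = sum(r["amount"] for r in loaded)
    assert abs(src_total - dst_total) < 1e-6, "amount drift during transform"
    return loaded

source = [{"cust": "1", "amt": "250.00"}, {"cust": "2", "amt": "100.00"}]
warehouse_rows = propagate(source)
print(warehouse_rows)
```

In a production pipeline the same count-and-sum reconciliation would typically run as a post-load check orchestrated by a tool such as Airflow, so that accuracy and timeliness failures surface before dashboards consume the data.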
Requirements:
Qualification: B.E/B.Tech/M.E/M.Tech/PhD from a tier 1/2 engineering institute, with relevant work experience at a top technology company.
9+ years of backend and infrastructure experience, with a strong track record in development, architecture, and design.
Hands-on experience with large-scale databases, high-scale messaging systems, and real-time job queues.
Experience navigating and understanding large-scale systems, complex codebases, and architectural patterns.
Proven experience in building high-scale data platforms.
Strong expertise in data warehouse schema design (star schema, fact tables, dimensions).
Experience with data movement, transformation, and integration tools for data propagation across systems.
Ability to evaluate and implement best practices in data architecture for scalable solutions.
Experience mentoring and providing technical leadership to other engineers in the team.
Nice to have:
Experience with Google Cloud, Django, Postgres, Celery, Redis.
Some experience with AI Infrastructure and Operations.
Compensation: We offer market-leading compensation, based on the skills and aptitude of the candidate.
To learn more, visit: https://thelevel.ai/
Funding: https://www.crunchbase.com/organization/level-ai
LinkedIn: https://www.linkedin.com/company/level-ai/
Our AI platform: https://www.youtube.com/watch?v=g06q2V_kb-s