Data Modeller - Insurance
Rackspace
Location: Bangalore
Job Type: Full-Time
About the Role: We are seeking a skilled Data Modeller with a strong background in the Insurance industry. The ideal candidate will have experience working with industry data models or building custom models. Familiarity with Databricks and Lakehouse architecture is highly advantageous.
Key Responsibilities:
Analyze and translate business needs into data models.
Develop conceptual, logical, and physical data models to support business processes.
Build data models for core insurance domains: Customer Master Data, Claims, Underwriting, Actuary, Customer Complaints/Support, and Finance.
Design and implement effective database solutions and models to store and retrieve data.
Work closely with data architects, data engineers, and database administrators to create and manage data systems.
Ensure data models comply with internal and external audit requirements.
Oversee the migration of data from legacy systems to new solutions.
Requirements:
Proven experience as a Data Modeller in the Insurance industry.
Experience with industry data models or building custom data models.
Exposure to data modeling tools such as ER/Studio, ERwin, or PowerDesigner.
Strong knowledge of SQL and database management systems like Oracle, SQL Server, or MySQL.
Familiarity with Databricks and Lakehouse architecture.
Excellent analytical and problem-solving skills.
Strong communication and collaboration skills.
Experience with cloud technologies like Azure or AWS.
Knowledge of ETL processes and data warehousing concepts.