
Data Cloud Engineer (DataOps)
Trafilea
- Location: Uruguay
- Posted
Design and maintain scalable data platforms in the cloud for Trafilea's e-commerce businesses.
Trafilea is a dynamic tech e-commerce group that operates multiple direct-to-consumer brands in the intimate apparel and beauty sectors. The company fosters a culture of collaboration, innovation, and continuous learning, with a remote-first approach that allows employees to work from anywhere in the world. As a Data Cloud Engineer at Trafilea, you will design and maintain scalable data platforms in the cloud, ensuring high availability, performance, and reliability of our data infrastructure. You will work closely with various teams to enable real-time and batch data processing, ensure data security and compliance, and optimize cloud resources for large-scale analytics and machine learning workloads. With a focus on investing in people and providing development opportunities, Trafilea offers a safe space for employees to grow professionally and personally. The company has been recognized as one of the Top 25 Companies for Remote Workers by Forbes and FlexJobs.
About Trafilea
Trafilea is a dynamic and innovative Tech E-commerce Group that operates multiple direct-to-consumer brands in the intimate apparel and beauty sectors, with a focus on using data-driven strategies to scale its businesses. In addition to our products, we have our own online community dedicated to promoting body positivity. As a rapidly growing global player, Trafilea is committed to creating high-quality products and services that enhance the customer experience and drive long-term growth.
At Trafilea, we foster a culture of collaboration, innovation, and continuous learning. We believe in investing in our people and providing them with the support and development opportunities they need to grow both personally and professionally. With our remote-first approach, you'll have the freedom to work from anywhere in the world, surrounded by a diverse and talented team that spans the globe.
🌟 Role Mission
The Data Cloud Engineer is responsible for designing and maintaining scalable, secure, and efficient data platforms in the cloud. You will work closely with Data Engineers, DevOps, and Analytics teams to ensure high availability, performance, and reliability of our data infrastructure.
Your mission is to enable real-time and batch data processing, ensure data security and compliance, and optimize cloud resources to support large-scale analytics and machine learning workloads.
🛠️ Responsibilities
🔹 Cloud Data Infrastructure & Optimization
Design and implement scalable data architectures in AWS (Redshift, Athena, Glue, S3, DynamoDB).
Optimize data storage, processing, and retrieval strategies for performance and cost efficiency.
Implement data lake and data warehouse solutions to support analytics and BI teams.
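To make the first bullet above concrete, here is a minimal Python sketch (using boto3, which the AWS stack named above implies) that runs an Athena query against a Glue-catalogued table in an S3 data lake and polls until it finishes. The region, database, table, and results bucket are illustrative placeholders, not Trafilea's actual resources.
```python
# Minimal sketch: run an Athena query over an S3-backed data lake and wait for
# a terminal state. Names below are placeholders for illustration only.
import time

import boto3

athena = boto3.client("athena", region_name="us-east-1")

QUERY = """
SELECT order_date, SUM(revenue) AS daily_revenue
FROM analytics.orders          -- hypothetical Glue-catalogued table
GROUP BY order_date
ORDER BY order_date DESC
LIMIT 30
"""

def run_query(sql: str, database: str = "analytics",
              output: str = "s3://example-athena-results/") -> str:
    """Start an Athena query and block until it finishes, returning its state."""
    execution = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": database},
        ResultConfiguration={"OutputLocation": output},
    )
    query_id = execution["QueryExecutionId"]
    while True:
        state = athena.get_query_execution(QueryExecutionId=query_id)[
            "QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            return state
        time.sleep(2)  # poll until Athena reports a terminal state

if __name__ == "__main__":
    print(run_query(QUERY))
```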
🔹 Security & Compliance
Ensure data security, encryption, and access control policies are in place.
Implement GDPR and SOC 2 compliance best practices for data governance.
Monitor and prevent data breaches, unauthorized access, and performance issues.
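As an illustration of the encryption and access-control work described above, the following boto3 sketch applies default KMS encryption and a full public-access block to a hypothetical data-lake bucket; the bucket name and KMS alias are placeholders.
```python
# Illustrative security controls for a hypothetical data-lake bucket.
import boto3

s3 = boto3.client("s3")
BUCKET = "example-data-lake"  # placeholder, not a real Trafilea bucket

# Require server-side encryption with a customer-managed KMS key by default.
s3.put_bucket_encryption(
    Bucket=BUCKET,
    ServerSideEncryptionConfiguration={
        "Rules": [{
            "ApplyServerSideEncryptionByDefault": {
                "SSEAlgorithm": "aws:kms",
                "KMSMasterKeyID": "alias/example-data-lake-key",
            },
            "BucketKeyEnabled": True,
        }]
    },
)

# Block every form of public access to the bucket.
s3.put_public_access_block(
    Bucket=BUCKET,
    PublicAccessBlockConfiguration={
        "BlockPublicAcls": True,
        "IgnorePublicAcls": True,
        "BlockPublicPolicy": True,
        "RestrictPublicBuckets": True,
    },
)
```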
🔹 Observability & Performance Monitoring
Implement monitoring and alerting solutions for data pipelines (Datadog, Prometheus).
Troubleshoot performance bottlenecks and improve query execution times.
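The monitoring bullet above mentions Prometheus; a minimal sketch of what instrumenting a batch pipeline could look like with the prometheus_client library is shown below, pushing a run's duration and row count to a Pushgateway so alerts can fire on slow or empty runs. The gateway address and job name are assumptions.
```python
# Minimal pipeline-observability sketch with prometheus_client.
import random
import time

from prometheus_client import CollectorRegistry, Gauge, push_to_gateway

registry = CollectorRegistry()
duration = Gauge("etl_job_duration_seconds",
                 "Wall-clock duration of the ETL job", registry=registry)
rows = Gauge("etl_rows_processed",
             "Rows written by the last successful run", registry=registry)

start = time.time()
processed = random.randint(1_000, 5_000)  # stand-in for real pipeline work
duration.set(time.time() - start)
rows.set(processed)

# Push metrics under a stable job name so Prometheus can scrape the gateway.
push_to_gateway("localhost:9091", job="orders_etl", registry=registry)
```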
🔹 Collaboration & Knowledge Sharing
Work closely with Data Scientists, BI Analysts, and DevOps teams to align data solutions with business needs.
Document data workflows, best practices, and infrastructure standards.
Provide guidance on data engineering best practices and mentor junior engineers.
What we offer
Collaborate with world-class talent in a data-driven, dynamic, and energetic work environment.
Opportunity to grow and develop both professionally and personally.
Safe space to be who you truly are, with a commitment to diversity, equity, and inclusion.
Openness to new ideas and initiatives.
Great benefits package including remote work, 15 working days of paid holidays, a learning subsidy, and more!
We've been recognized by Forbes and FlexJobs as one of the Top 25 Companies for Remote Workers. Apply now!
🎓 Qualifications
✅ Must-Have Skills
✔️ 3+ years of experience as a Data Cloud Engineer, Data Engineer, or Cloud Engineer.
✔️ Strong experience with AWS cloud services for data processing (Redshift, Athena, Glue, S3, DynamoDB, Kinesis).
✔️ Expertise in SQL and distributed databases (PostgreSQL, Snowflake, BigQuery, ClickHouse).
✔️ Hands-on experience with ETL/ELT pipelines, automation, and orchestration tools (Airflow, Step Functions, dbt).
✔️ Strong knowledge of Infrastructure as Code (Terraform, CloudFormation) for data infrastructure.
✔️ Experience working with large-scale data processing frameworks (Apache Spark, Flink, EMR).
✔️ Proficiency in Python, Bash, or similar scripting languages for automation.
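The must-have list above calls out Airflow for orchestration; as a rough illustration only, the sketch below wires an extract task into a load task with the Airflow 2.x TaskFlow API (the `schedule` argument assumes Airflow 2.4 or newer), with placeholder task bodies.
```python
# Minimal Airflow TaskFlow DAG sketch: extract -> load on a daily schedule.
from datetime import datetime

from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def orders_daily():
    @task
    def extract() -> list[dict]:
        # Placeholder for pulling raw order events from an upstream source.
        return [{"order_id": 1, "revenue": 42.0}]

    @task
    def load(records: list[dict]) -> None:
        # Placeholder for writing curated records to the warehouse.
        print(f"loaded {len(records)} records")

    load(extract())

orders_daily()
```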
✅ Nice-to-Have Skills
➕ Experience with streaming data architectures (Kafka, Kinesis, Apache Flink, RabbitMQ).
➕ Knowledge of machine learning pipelines and feature stores.
➕ Experience with Data Governance, Data Quality, and Security Best Practices.
➕ Understanding of FinOps principles for cost optimization in cloud data solutions.