
Data Cloud Engineer (DataOps)
Trafilea
- Location: Uruguay
- Posted
Design and maintain scalable data platforms in the cloud for Trafilea's e-commerce businesses.
Trafilea is a dynamic tech e-commerce group that operates multiple direct-to-consumer brands in the intimate apparel and beauty sectors. The company fosters a culture of collaboration, innovation, and continuous learning, with a remote-first approach that allows employees to work from anywhere in the world. As a Data Cloud Engineer at Trafilea, you will design and maintain scalable data platforms in the cloud, ensuring high availability, performance, and reliability of our data infrastructure. You will work closely with various teams to enable real-time and batch data processing, ensure data security and compliance, and optimize cloud resources for large-scale analytics and machine learning workloads. With a focus on investing in people and providing development opportunities, Trafilea offers a safe space for employees to grow professionally and personally. The company has been recognized as one of the Top 25 Companies for Remote Workers by Forbes and FlexJobs.
About Trafilea
Trafilea is a dynamic and innovative Tech E-commerce Group that operates multiple direct-to-consumer brands in the intimate apparel and beauty sectors, with a focus on using data-driven strategies to scale its businesses. In addition to our products, we have our own online community dedicated to promoting body positivity. As a rapidly growing global player, Trafilea is committed to creating high-quality products and services that enhance the customer experience and drive long-term growth.
At Trafilea, we foster a culture of collaboration, innovation, and continuous learning. We believe in investing in our people and providing them with the support and development opportunities they need to grow both personally and professionally. With our remote-first approach, you'll have the freedom to work from anywhere in the world, surrounded by a diverse and talented team that spans the globe.
🌟 Role Mission
The Data Cloud Engineer is responsible for designing and maintaining scalable, secure, and efficient data platforms in the cloud. You will work closely with Data Engineers, DevOps, and Analytics teams to ensure high availability, performance, and reliability of our data infrastructure.
Your mission is to enable real-time and batch data processing, ensure data security and compliance, and optimize cloud resources to support large-scale analytics and machine learning workloads.
🛠️ Responsibilities
🔹 Cloud Data Infrastructure & Optimization
Design and implement scalable data architectures in AWS (Redshift, Athena, Glue, S3, DynamoDB).
Optimize data storage, processing, and retrieval strategies for performance and cost efficiency.
Implement data lake and data warehouse solutions to support analytics and BI teams.
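To give a concrete flavor of this kind of work, here is a minimal Python sketch, not Trafilea's actual setup, in which the database, table, bucket, and column names are all hypothetical. It submits a query to Amazon Athena over a partitioned data-lake table via boto3; filtering on the partition column lets Athena prune partitions so that only the relevant S3 prefixes are scanned, one of the main levers for query cost and performance.

```python
import boto3

# Hypothetical resources -- replace with real data-lake names.
DATABASE = "analytics_lake"
OUTPUT_LOCATION = "s3://example-athena-results/queries/"

athena = boto3.client("athena", region_name="us-east-1")

# Filtering on the partition column (event_date) allows partition pruning,
# so Athena scans only the matching S3 prefixes instead of the whole table.
query = """
    SELECT order_id, total_amount
    FROM orders
    WHERE event_date = DATE '2024-01-01'
"""

response = athena.start_query_execution(
    QueryString=query,
    QueryExecutionContext={"Database": DATABASE},
    ResultConfiguration={"OutputLocation": OUTPUT_LOCATION},
)
print("Started query:", response["QueryExecutionId"])
```

Storing the underlying data as date-partitioned, compressed Parquet is a common first step toward the storage- and cost-optimization goals listed above.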
🔹 Security & Compliance
Ensure data security, encryption, and access control policies are in place.
Implement data governance best practices aligned with GDPR, SOC 2, and other compliance requirements.
Monitor for and prevent data breaches, unauthorized access, and performance issues.
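As a small illustration of the baseline controls this list describes, the following Python/boto3 sketch (hypothetical bucket and KMS key; not a complete compliance program) enforces default KMS encryption and blocks all public access on an S3 bucket:

```python
import boto3

BUCKET = "example-data-lake-raw"  # hypothetical bucket name
KMS_KEY_ARN = "arn:aws:kms:us-east-1:123456789012:key/example"  # hypothetical key

s3 = boto3.client("s3")

# Require server-side encryption with a customer-managed KMS key by default.
s3.put_bucket_encryption(
    Bucket=BUCKET,
    ServerSideEncryptionConfiguration={
        "Rules": [
            {
                "ApplyServerSideEncryptionByDefault": {
                    "SSEAlgorithm": "aws:kms",
                    "KMSMasterKeyID": KMS_KEY_ARN,
                },
                "BucketKeyEnabled": True,
            }
        ]
    },
)

# Block every form of public access to the bucket.
s3.put_public_access_block(
    Bucket=BUCKET,
    PublicAccessBlockConfiguration={
        "BlockPublicAcls": True,
        "IgnorePublicAcls": True,
        "BlockPublicPolicy": True,
        "RestrictPublicBuckets": True,
    },
)
```

In practice such controls are usually codified in Terraform or CloudFormation rather than applied imperatively, so they stay versioned and auditable.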
🔹 Observability & Performance Monitoring
Implement monitoring and alerting solutions for data pipelines (Datadog, Prometheus).
Troubleshoot performance bottlenecks and improve query execution times.
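For example, one lightweight way to get pipeline health metrics in front of an alerting system is the official prometheus_client library for Python; the sketch below (hypothetical metric, port, and pipeline names) exposes row counts and ingestion lag that a Prometheus alert rule could then watch:

```python
import random
import time

from prometheus_client import Counter, Gauge, start_http_server

# Hypothetical pipeline metrics.
PIPELINE_LAG = Gauge("pipeline_lag_seconds", "Seconds the pipeline is behind its source", ["pipeline"])
ROWS_PROCESSED = Counter("pipeline_rows_total", "Total rows processed", ["pipeline"])

def report_batch(pipeline: str, rows: int, lag_seconds: float) -> None:
    """Record the outcome of one processed batch."""
    ROWS_PROCESSED.labels(pipeline=pipeline).inc(rows)
    PIPELINE_LAG.labels(pipeline=pipeline).set(lag_seconds)

if __name__ == "__main__":
    start_http_server(8000)  # metrics served at http://localhost:8000/metrics
    while True:
        # Simulated batches; a real pipeline would call report_batch() after each run.
        report_batch("orders_ingest", rows=random.randint(100, 1000), lag_seconds=random.uniform(0, 60))
        time.sleep(15)
```

A Datadog setup would look similar, with the agent's StatsD interface or the Datadog API in place of the Prometheus exposition endpoint.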
🔹 Collaboration & Knowledge Sharing
Work closely with Data Scientists, BI Analysts, and DevOps teams to align data solutions with business needs.
Document data workflows, best practices, and infrastructure standards.
Provide guidance on data engineering best practices and mentor junior engineers.
What we offer
Collaborate with world-class talent in a data-driven, dynamic, energetic work environment.
Opportunity to grow and develop both professionally and personally.
Safe space to be who you truly are, with a commitment to diversity, equity, and inclusion.
Openness to new ideas and initiatives.
Great benefits package, including remote work, 15 working days of paid vacation, a learning subsidy, and more!
We've been recognized by Forbes and FlexJobs as one of the Top 25 Companies for Remote Workers. Apply now!
🎓 Qualifications
✅ Must-Have Skills
✔️ 3+ years of experience as a Data Cloud Engineer, Data Engineer, or Cloud Engineer.
✔️ Strong experience with AWS cloud services for data processing (Redshift, Athena, Glue, S3, DynamoDB, Kinesis).
✔️ Expertise in SQL and analytical/distributed databases (PostgreSQL, Snowflake, BigQuery, ClickHouse).
✔️ Hands-on experience with ETL/ELT pipelines, automation, and orchestration tools (Airflow, Step Functions, dbt).
✔️ Strong knowledge of Infrastructure as Code (Terraform, CloudFormation) for data infrastructure.
✔️ Experience working with large-scale data processing frameworks (Apache Spark, Flink, EMR).
✔️ Proficiency in Python, Bash, or similar scripting languages for automation.
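To illustrate the orchestration experience asked for above, here is a minimal Airflow DAG sketch in Python (assuming Airflow 2.4+; the DAG id and task bodies are placeholders rather than an actual Trafilea pipeline) that chains extract, transform, and load steps on a daily schedule:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    # Pull raw data from a source system (placeholder).
    pass

def transform():
    # Clean and model the extracted data (placeholder).
    pass

def load():
    # Write the modeled data to the warehouse (placeholder).
    pass

with DAG(
    dag_id="example_elt_pipeline",   # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Define task dependencies: extract -> transform -> load.
    extract_task >> transform_task >> load_task
```

In many teams the transform step is delegated to dbt and heavy processing to Spark on EMR or AWS Glue, with Airflow handling scheduling, retries, and dependencies.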
✅ Nice-to-Have Skills
➕ Experience with streaming data architectures (Kafka, Kinesis, Apache Flink, RabbitMQ).
➕ Knowledge of machine learning pipelines and feature stores.
➕ Experience with Data Governance, Data Quality, and Security Best Practices.
➕ Understanding of FinOps principles for cost optimization in cloud data solutions.