
Data Cloud Engineer (DataOps)
Trafilea
- Location: Uruguay
- Posted
Design and maintain scalable data platforms in the cloud for Trafilea's e-commerce businesses.
About Trafilea
Trafilea is a dynamic and innovative Tech E-commerce Group that operates multiple direct-to-consumer brands in the intimate apparel and beauty sectors, with a focus on using data-driven strategies to scale its businesses. In addition to our products, we have our own online community dedicated to promoting body positivity. As a rapidly growing global player, Trafilea is committed to creating high-quality products and services that enhance the customer experience and drive long-term growth.
At Trafilea, we foster a culture of collaboration, innovation, and continuous learning. We believe in investing in our people and providing them with the support and development opportunities they need to grow both personally and professionally. With our remote-first approach, you'll have the freedom to work from anywhere in the world, surrounded by a diverse and talented team that spans the globe.
🌟 Role Mission
The Data Cloud Engineer is responsible for designing and maintaining scalable, secure, and efficient data platforms in the cloud. You will work closely with Data Engineers, DevOps, and Analytics teams to ensure high availability, performance, and reliability of our data infrastructure.
Your mission is to enable real-time and batch data processing, ensure data security and compliance, and optimize cloud resources to support large-scale analytics and machine learning workloads.
🛠️ Responsibilities
🔹 Cloud Data Infrastructure & Optimization
Design and implement scalable data architectures in AWS (Redshift, Athena, Glue, S3, DynamoDB).
Optimize data storage, processing, and retrieval strategies for performance and cost efficiency.
Implement data lake and data warehouse solutions to support analytics and BI teams.
🔹 Security & Compliance
Ensure data security, encryption, and access control policies are in place.
Implement GDPR, SOC 2, and other compliance best practices for data governance.
Monitor for and prevent data breaches, unauthorized access, and performance issues.
🔹 Observability & Performance Monitoring
Implement monitoring and alerting solutions for data pipelines (Datadog, Prometheus).
Troubleshoot performance bottlenecks and improve query execution times.
🔹 Collaboration & Knowledge Sharing
Work closely with Data Scientists, BI Analysts, and DevOps teams to align data solutions with business needs.
Document data workflows, best practices, and infrastructure standards.
Provide guidance on data engineering best practices and mentor junior engineers.
What we offer
Collaborate with world-class talents in a data-driven, dynamic, energetic work environment.
Opportunity to grow and develop both professionally and personally.
Safe space to be who you truly are, with a commitment to diversity, equity, and inclusion.
Openness to new ideas and initiatives.
Great benefits package including remote work, 15 working days of paid holidays, a learning subsidy, and more!
We've been recognized by Forbes and FlexJobs as one of the Top 25 Companies for Remote Workers. Apply now!
🎓 Qualifications
✅ Must-Have Skills
✔️ 3+ years of experience as a Data Cloud Engineer, Data Engineer, or Cloud Engineer.
✔️ Strong experience with AWS cloud services for data processing (Redshift, Athena, Glue, S3, DynamoDB, Kinesis).
✔️ Expertise in SQL and distributed databases (PostgreSQL, Snowflake, BigQuery, ClickHouse).
✔️ Hands-on experience with ETL/ELT pipelines, automation, and orchestration tools (Airflow, Step Functions, dbt).
✔️ Strong knowledge of Infrastructure as Code (Terraform, CloudFormation) for data infrastructure.
✔️ Experience working with large-scale data processing frameworks (Apache Spark, Flink, EMR).
✔️ Proficiency in Python, Bash, or similar scripting languages for automation.
✅ Nice-to-Have Skills
➕ Experience with streaming data architectures (Kafka, Kinesis, Apache Flink, RabbitMQ).
➕ Knowledge of machine learning pipelines and feature stores.
➕ Experience with data governance, data quality, and security best practices.
➕ Understanding of FinOps principles for cost optimization in cloud data solutions.