Senior Data Engineer (data vault)
Company details
Company: Workato
Job type: Remote
Country: United States
City: Los Angeles
Region: California
Experience: 4 years or more
Description of the offer
Overview
At Workato, we’re redefining business automation by integrating innovative technologies that drive digital transformation. We’re seeking a highly skilled Senior Data Engineer to lead the design, development, and optimization of our modern data infrastructure. In this role, you will work extensively with advanced tools such as dbt, Automate DV, Trino, Snowflake, Apache Iceberg, and Apache Airflow to build robust, scalable, and efficient data pipelines that empower our decision-making and analytics capabilities.
You will work closely with data scientists, building and maintaining a data vault for them, integrating their models into the data vault, and consolidating diverse data sources into a single data warehouse:
- Product usage data
- ETL data from AI services
- Business data
- External data
Key Responsibilities
- Data Pipeline Development:
- Design, develop, and maintain data pipelines and ETL processes using dbt and Apache Airflow to ensure seamless data integration, transformation, and validation across diverse data sources.
- Data Infrastructure Management:
- Architect and implement scalable data solutions utilizing Snowflake as a data warehouse and leverage Trino for efficient query execution across distributed data sets.
- Modern Data Technologies:
- Integrate and optimize data workflows using Automate DV and Apache Iceberg to manage data versioning, quality, and lifecycle, ensuring reliability and compliance.
- Collaboration & Leadership:
- Work closely with data scientists, analysts, and business stakeholders to translate requirements into technical solutions. Mentor junior engineers and lead code reviews to promote best practices in data engineering.
- Performance & Optimization:
- Continuously monitor, troubleshoot, and optimize data processes to ensure high performance, minimal downtime, and optimal resource utilization.
- Innovation & Best Practices:
- Stay abreast of emerging trends in data engineering and automation, driving innovation and adopting new tools and techniques that enhance data processing and integration capabilities.
Qualifications
Education & Experience:
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
- 5+ years of experience in data engineering, with a proven track record of designing and managing large-scale data infrastructures.
Technical Expertise:
- Proficiency in dbt for data transformation and modeling.
- Experience with Automate DV for automating Data Vault modeling and workflows in dbt.
- Hands-on expertise with Trino as a distributed SQL query engine.
- Deep understanding of Snowflake architecture and its ecosystem.
- Knowledge of Apache Iceberg for managing large analytic datasets.
- Strong background in orchestrating workflows using Apache Airflow.
- Proficiency in SQL and at least one programming language (Python preferred).
Analytical & Problem-Solving Skills:
- Ability to analyze complex data challenges and design innovative, data-driven solutions.
- Strong debugging skills and attention to detail.
Soft Skills:
- Excellent communication and collaboration skills.
- Demonstrated leadership and mentoring capabilities.
- Ability to thrive in a fast-paced, dynamic environment.
Preferred Qualifications
- Familiarity with cloud data platforms (AWS, GCP, or Azure) and containerization technologies.
- Experience in agile development methodologies.
- Proven track record of working in automation-centric environments.
What We Offer
- Competitive salary and comprehensive benefits package.
- Opportunities for professional growth and continuous learning.
- A dynamic and innovative work environment where your ideas make a real impact.
- The chance to work with cutting-edge technologies alongside passionate industry experts.
How to apply?
Click on the button to get the company email or employment application form.
