- Full Time
- Santa Clara (Remote)
We are looking to fill multiple Sr. Data Engineering roles, located in Santa Clara, CA, with potential for placement at various unanticipated client sites throughout the United States for project implementation. May require extended travel or relocation.
Requires Bachelor’s degree in Computer Science, Computer Information Systems, Electronics Engineering, or a related field.
Starting annual salary $121,000.00.
Job Duties:
- Collect and analyze data from various sources, such as databases, APIs, external data providers, and streaming platforms, to ingest, process, and transform large volumes of structured and unstructured data.
- Develop and manage data warehouse solutions, including data modeling, schema design, and optimization for querying and analytics.
- Design and implement efficient data pipelines using modern cloud-based tools such as Matillion and DBT to ensure a smooth flow of information into the data warehouse.
- Develop and implement processes for data quality assessment, validation, and cleansing using Matillion, DBT, and Snowflake to ensure the accuracy and reliability of data.
- Document Matillion and DBT data pipelines, processes, and system architectures, and provide training and knowledge sharing to team members.
- Collaborate with cross-functional teams, including data scientists, analysts, and software engineers, to understand data requirements and deliver solutions that meet business needs.
- Manage data engineering projects, including scoping, planning, resource allocation, and coordination with stakeholders to ensure timely and successful delivery.
- Administer the Snowflake database, including configuration, tuning, and performance monitoring, to ensure data integrity, availability, and reliability.
- Design and optimize data models for efficient storage and querying within Snowflake, including defining schemas, organizing data into tables, and optimizing queries for performance.
- Ensure the security and compliance of data stored in Snowflake, including implementing access controls, encryption, and data masking.
- Monitor the health and performance of Snowflake environments, identify and troubleshoot issues, and implement solutions to ensure high availability and reliability.
- Create documentation and provide training to users and other team members on best practices for using Snowflake, optimizing queries, and maintaining data quality.
- Collaborate with other team members, such as data analysts, data scientists, and software engineers, to understand data requirements and ensure that Snowflake solutions meet the needs of the organization.
- Effective communication skills are essential for conveying technical concepts and requirements to non-technical stakeholders.
- Design and architect ETL (Extract, Transform, Load) processes to move data from source systems to target data warehouses or data lakes.
- Work closely with stakeholders to understand data requirements and design efficient data integration workflows.
- Write and configure transformation logic using Matillion’s components and scripting capabilities, performing data cleansing, validation, aggregation, enrichment, and other transformations to ensure data quality and consistency.
- Implement error handling mechanisms and logging to capture and handle exceptions during ETL execution. Monitor pipeline execution, track job status, and troubleshoot issues to ensure data integrity and reliability.
- Write SQL queries in DBT to transform and manipulate data within the data warehouse, including aggregating, filtering, joining, and pivoting data to meet business requirements.
- Use DBT’s capabilities to create reusable transformation logic and apply best practices for data transformation.
- Use version control systems such as Git to manage ETL code and configurations.
- Maintain separate development, staging, and production environments and follow best practices for code promotion and deployment to ensure changes are tested and deployed safely.
To apply for this job, email your details to badrin@cloudeqs.com