They are seeking a Data Engineer who will partner with business, analytics, and engineering teams to design, build, and maintain easy-to-use data structures that support reporting and the monitoring of key performance indicators. Collaborating across disciplines, you will identify internal and external data sources, design table structures, define ETL strategy and automated QA checks, and implement scalable ETL solutions.
- Partner with technical and non-technical colleagues to understand data and reporting requirements.
- Work with engineering teams to collect required data from internal and external systems.
- Design table structures and define ETL strategy to build performant data solutions that are reliable and scalable in a fast-growing data ecosystem.
- Develop data quality checks for source and target data sets. Develop UAT plans and conduct QA.
- Develop and maintain ETL routines using ETL and orchestration tools such as Airflow, Luigi, and Jenkins.
- Document and publish metadata and table designs to facilitate data adoption.
- Perform ad hoc analysis as necessary.
- Perform SQL and ETL tuning as necessary.
- Develop and maintain dashboards/reports using Tableau and Looker.
- Coordinate and resolve escalated production support incidents in the Tier 2 support rotation.
- Create runbooks and actionable alerts as part of the development process.
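To give a concrete sense of the data quality checks described above, here is a minimal sketch of the kind of automated source-vs-target QA a candidate might build. The table and column names are hypothetical, and SQLite is used purely for illustration; in practice these checks would run against the warehouse (e.g. BigQuery or Redshift) inside an orchestrator such as Airflow.

```python
import sqlite3

def run_qa_checks(conn, source_table, target_table, key_column):
    """Run basic QA checks comparing a source table to its ETL target.

    Returns a dict mapping check name -> True (pass) / False (fail).
    """
    cur = conn.cursor()
    checks = {}

    # 1. Row counts match between source and target (no dropped rows).
    src_count = cur.execute(f"SELECT COUNT(*) FROM {source_table}").fetchone()[0]
    tgt_count = cur.execute(f"SELECT COUNT(*) FROM {target_table}").fetchone()[0]
    checks["row_count_match"] = src_count == tgt_count

    # 2. The key column has no NULLs in the target.
    nulls = cur.execute(
        f"SELECT COUNT(*) FROM {target_table} WHERE {key_column} IS NULL"
    ).fetchone()[0]
    checks["no_null_keys"] = nulls == 0

    # 3. The key column is unique in the target (no duplicate loads).
    dupes = cur.execute(
        f"SELECT COUNT(*) FROM (SELECT {key_column} FROM {target_table} "
        f"GROUP BY {key_column} HAVING COUNT(*) > 1)"
    ).fetchone()[0]
    checks["unique_keys"] = dupes == 0

    return checks

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders_src (order_id INTEGER, amount REAL)")
    conn.execute("CREATE TABLE orders_tgt (order_id INTEGER, amount REAL)")
    rows = [(1, 10.0), (2, 20.0), (3, 30.0)]
    conn.executemany("INSERT INTO orders_src VALUES (?, ?)", rows)
    conn.executemany("INSERT INTO orders_tgt VALUES (?, ?)", rows)
    print(run_qa_checks(conn, "orders_src", "orders_tgt", "order_id"))
```

Each check returning a boolean makes it straightforward to wire the results into actionable alerts, as the responsibilities above call for.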
- 2+ years of relevant professional experience.
- 1+ years of work experience implementing and reporting on business key performance indicators in data warehousing environments. Strong understanding of data modeling principles, including dimensional modeling and data normalization.
- 1+ years of experience using analytic SQL, working with traditional relational databases and/or distributed systems such as Hadoop/Hive, BigQuery, or Redshift.
- Experience with programming languages (e.g., Python, R, Bash) preferred.
- 1+ years of experience with workflow management tools (Airflow, Oozie, Azkaban, UC4).
- Good understanding of SQL engines and the ability to conduct advanced performance tuning.
- Experience with the Hadoop (or similar) ecosystem (MapReduce, YARN, HDFS, Hive, Spark, Presto, Pig, HBase).
- Familiarity with data exploration/data visualization tools like Tableau, Looker, Chartio, etc.
- Ability to think strategically, analyze and interpret market and consumer information.
- Strong communication skills – written and verbal presentations.
- Excellent conceptual and analytical reasoning competencies.
- A degree in an analytical field such as economics, mathematics, or computer science is desired.
- Comfortable working in a fast-paced and highly collaborative environment.
- Process-oriented with great documentation skills.
- Salary offer: $0 ~ $3,000
- Experience level: Junior
- Total years of experience: 0-5