ETL Data Engineer with Teradata

Location: Charlotte, NC, USA

Job Type: Hybrid

Experience: 8+ Years

About the Role

Responsibilities:
• Support the design, architecture, and development of Enterprise Information / Data Science products owned and operated by the Enterprise Data Science Technology (EDST) team
• Support the Payments Analytics work for the EDST systems engineering / application development team, which serves the Wholesale Risk Analytics (WRA) team in GRA Data Science (GRADS) and Global Treasury Services (GTS), with development on the Enterprise Teradata and SAS Grid platforms
• Develop, enhance, debug, support, maintain, and test data applications that serve business units or supporting functions
• Support end-to-end data-warehouse application architecture and development, including requirements gathering, design, coding, testing, and implementation of complex ETL and analytic applications and ETL workloads in Teradata SQL
• Work directly with product owners to understand requirements, align them to the platform technology stack, and design and document solutions and methods for data sourcing and provisioning
• As a Developer / Data Engineer, take accountability and ownership of the coding, testing, and deployment framework for end-to-end ETL application development using agile methodology and SDM tools (JIRA, Bitbucket) on the enterprise data platform(s)
• Support systems through maintenance, modification, and problem resolution to ensure ongoing delivery of ETL and analytics services and operations on the big data platform
• Serve as a fully seasoned, proficient technical resource, ready to get into the weeds of the code, analyze and research data and optimization problems, and discuss technical details with the development/support team
• Provide technical knowledge and capabilities as a team member and individual contributor, while also instructing, directing, and checking the quality and timeliness of other systems professionals, including offshore resources
• May lead multiple projects with competing deadlines
• Work under minimal supervision, with general guidance from a manager

Requirements:
• Computer Science, Software Engineering, or related degree
• 7+ years' experience with end-to-end ETL and analytics application development on Teradata-based data-warehouse and analytical platforms
• Extensive experience developing Teradata SQL-based ETL and analytic workflows using native utilities (BTEQ, TPT, FastExport)
• Very good knowledge of Unix/Linux shell scripting and job scheduling (e.g., Autosys)
• Knowledge and experience with CI/CD-based development and deployment (JIRA, Bitbucket)
• Excellent written communication and diagramming skills
• Strong analytical and problem-solving abilities
• Speaking and presentation skills in a professional setting
• Excellent interpersonal skills; a team player able to work with global teams and business partners
• Positive attitude and flexibility
• Willingness to learn new skills and adapt to change

Desired skills:
• Industry certifications in analytics (Tableau), SQL (Teradata), and big data technologies such as Hadoop (Cloudera) and Hive
• Experience working with big data technologies, programs, and toolsets such as Hadoop, Hive, Sqoop, Impala, Kafka, and Python/Spark/PySpark workloads

*Send your resume by clicking the Apply button.

© 2023 BinTech Group LLC.  All Rights Reserved.
