U.S. Bank Senior Big Data ETL Developer in Fargo, North Dakota
At U.S. Bank, we're passionate about helping customers and the communities where we live and work. The fifth-largest bank in the United States, we're one of the country's most respected, innovative and successful financial institutions. U.S. Bank is an equal opportunity employer committed to creating a diverse workforce. We consider all qualified applicants without regard to race, religion, color, sex, national origin, age, sexual orientation, gender identity, disability or veteran status, among other factors.
U.S. Bank is seeking an experienced Senior Big Data ETL Developer to support our critical Enterprise Landing Zone (ELZ) environment, a Hadoop- and Informatica-based sourcing solution for the bank. We are in the early phases of building out this highly visible, critical asset and have plans to expand its usage and functionality beyond sourcing. Candidates must have a background in ETL development using current big data ETL products, such as Informatica Big Data Edition (BDE) and Talend. The candidate will be responsible for designing, developing, testing, and supporting ETL jobs throughout the SDLC. The ability to analyze and resolve complex technical problems is a must, as is a willingness to learn new skills. Ideally, the hire for this position will be able to transition into new work streams, including but not limited to native Hadoop development. Successful applicants will exemplify U.S. Bank's ethical principles of uncompromising integrity, respect for others, accountability for decisions and actions, and good citizenship.
-Bachelor's degree or equivalent work experience.
-At least 7 years of experience with design, development, automation, and support of applications to extract, transform, and load data.
-At least 5 years of experience within a total information technology (IT) environment.
-At least 2 years of experience with Informatica PowerCenter Big Data Edition (BDE) or Talend.
-At least 2 years of application development experience with Java or Python.
-Excellent verbal and written communication skills, including the ability to communicate technical issues to a non-technical audience.
-Self-motivated, with an inclination to learn new things and the ability to adapt to change and make progress.
-Knowledge of software development lifecycle.
-Minimum of 5 years of experience with relational databases and large data warehouses.
-Minimum of 2 years of experience with the Hortonworks distribution, leveraging Hadoop ecosystem components for batch and real-time data ingestion, data integration, and data consumption (HDFS, YARN, Hive, HBase, MapReduce, Spark, Talend/Informatica/DataStage, Sqoop, Flume, Kafka).
-Strong expertise in software engineering development and testing life cycles using Java, Python, Scala, and other scripting languages.
-Minimum of 5 years of hands-on experience with Unix/Linux shell scripting.
-Experience using or developing REST APIs or other APIs for data ingestion and data consumption.
-Experience with performance tuning and optimization of HiveQL to process big data from heterogeneous sources.
-Excellent problem-solving and technical troubleshooting skills to provide application support.
-Ability to handle multiple projects and prioritize tasks in a rapidly changing environment.
Job: Information Technology
Primary Location: United States
Shift: 1st - Daytime
Average Hours Per Week: 40
Requisition ID: 170018601