Job Title:
Apps Systems Engineer 5 (Hadoop Developer)
Requisition Number:
3867271
Schedule Type:
Regular
Work Hours:
40.00
Telecommute Option:
Yes
Location:
MN-Minneapolis
CA-SF-South Of Market Area
AZ-Chandler
NC-Charlotte
MO-Saint Louis
Job Description
Location: The hiring manager may be open to any domestic location within the United States, depending on applicant experience.
This position sits within our Enterprise Data and Analytics (EDA) Division, focusing on Big Data. The EDA Big Data platform supports Hadoop and Aster technologies in a multi-tenant environment serving multiple lines of business (LOBs).
The EDA Big Data team is looking for an Applications Systems Engineer to work in a fast-paced agile development environment. In this role you will:
– Act as a liaison between the business and the Hadoop development team; capture business requirements and translate them into functional requirements for developers
– Help with data management (tracking of sources in the Big Data environment), system documentation, and metadata capture
– Quickly analyze, develop, and test potential use cases for the business, taking valid use cases from ideation through development to production
– Write efficient code to extract, transform, load, and query very large datasets, including unstructured data
– Develop standards and new design patterns for Big Data applications
– Build out and populate a data lake
– Understand MapReduce concepts and master the tools and technology components within the Hadoop and Aster environments
– Help formulate use cases from potentially ambiguous business requirements
– Provide timely communication to business partners on use cases and project status
– Mentor and assist users in accessing the data
– Contribute significantly toward developing a roadmap for Big Data within the existing data warehousing architecture
Basic Qualifications
7+ years of application development and implementation experience.
Minimum Qualifications
– 1+ year of Hadoop experience, including MapReduce, HDFS, Hive, and Pig, on a major distribution such as Hortonworks or Cloudera
– 5+ years of experience with data management and manipulation, such as ETL
– 5+ years of Unix shell programming experience
– 5+ years of SQL and relational database experience
– 3+ years of experience in large scale data warehousing, distributed computing, and parallel processing
– Experience working with business partners on analytical solutions
– Demonstrated effective verbal and written communication skills
Preferred Skills
– Java and/or C/C++ programming experience
– Experience with multiple databases, such as Teradata, DB2, and Oracle
– Hadoop developer certification
– Experience with unstructured datasets, such as log files, email, and text
– Experience as a scrum master or with agile development methodologies
– Experience with Ab Initio development and implementation
– Bachelor's degree in Computer Science or a related technical field