2017-02-24

**Overview:**

The Big Data Engineer will have experience with data transport within an Apache Hadoop Big Data architecture and with ETL, specifically data wrangling and transformation.

**Responsibilities:**

+ Ingest data from various structured data sources into Hadoop and other distributed Big Data systems (HDFS, HBase, Hive, Sqoop).

+ Develop, sustain, and deliver an automated ETL pipeline using Java and other scripting tools (MapReduce, Python, Pig); a brief illustrative sketch follows this list.

+ Validate data extracted from structured data inputs, databases, and other repositories using scripts, logs, queries, and other automated capabilities.

+ Enrich and transform extracted data, as required. Monitor and report the data flow through the ETL process.

+ Perform data extractions, data purges, or data fixes in accordance with current internal procedures and policies.

+ Track development and operational support via user stories and decomposed technical tasks in provided issue-tracking and development tooling, including Git, Maven, and JIRA.

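The responsibilities above describe a standard ingest, transform, validate, and load loop. The following is a minimal sketch of what one stage of such a pipeline might look like in Spark with Scala (both listed in the qualifications below); the connection string, table names, and `orders` columns are hypothetical placeholders, not details from this posting.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

// Minimal sketch of an ingest -> transform -> validate -> load step.
// Connection details, table names, and columns are illustrative only.
object OrdersEtl {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("orders-etl-sketch")
      .enableHiveSupport()
      .getOrCreate()

    // Ingest: pull a structured source table over JDBC
    // (Sqoop is the batch alternative named in the responsibilities).
    val raw = spark.read
      .format("jdbc")
      .option("url", "jdbc:sqlserver://source-host:1433;databaseName=sales") // placeholder
      .option("dbtable", "dbo.orders")                                       // placeholder
      .option("user", sys.env.getOrElse("ETL_USER", "etl"))
      .option("password", sys.env.getOrElse("ETL_PASSWORD", ""))
      .load()

    // Enrich/transform: normalize the timestamp and derive a partition column.
    val transformed = raw
      .withColumn("order_ts", to_timestamp(col("order_ts")))
      .withColumn("order_date", to_date(col("order_ts")))
      .filter(col("order_id").isNotNull)

    // Validate: simple row-count reconciliation before publishing.
    val sourceCount = raw.count()
    val targetCount = transformed.count()
    require(targetCount <= sourceCount,
      s"row count grew unexpectedly: $sourceCount -> $targetCount")

    // Load: write to a Hive-managed Parquet table in the data lake.
    transformed.write
      .mode("overwrite")
      .partitionBy("order_date")
      .format("parquet")
      .saveAsTable("lake.orders_clean")                                      // placeholder

    spark.stop()
  }
}
```
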
**Qualifications:**

+ 7 years of overall IT experience and 3 years with a toolset including:

+ Hadoop-based Data Lake ecosystem

+ Apache Spark

+ Scala Programming Language

+ Apache Sqoop

+ Hive

+ Enterprise Data Warehouse Development experience

+ ETL development experience using a major ETL tool (SSIS, Informatica, IBM, Oracle, etc.)

+ Batch, Near-Real-Time and Micro-batch Integration Techniques

+ Change Data Capture (CDC) techniques (see the illustrative sketch after this list)

+ Strong analytical and coding skills

+ Excellent communication skills

+ Microsoft Azure HDInsight

+ Apache Oozie

+ Apache Parquet

+ Apache HBase

+ Apache Thrift Server

+ Hadoop Management Tools (Ranger, Falcon, Ambari)

+ Metadata Management Tools (HCatalog, Hive Metastore, Alation)

+ Hands-on SQL Server Integration Services (SSIS) experience

+ MuleSoft Integration Bus

+ Big Data integration solutioning, cost modeling and capacity planning

+ REST API development

+ BI Reporting Tool Experience (MicroStrategy, Cognos, SSAS, Tableau, QlikView, etc.)

+ Advanced Analytics Tool experience (SAS, SPSS, R, etc.)

+ Data Wrangling Tools (Alteryx, Trifacta, Paxata)

+ Logical, Physical and Dimensional Data Modeling

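Several of the items above (CDC, batch and micro-batch integration, Hive) commonly come together in a watermark-based incremental load. The sketch below is only a rough illustration of that technique in Spark with Scala; the `lake.load_audit` tracking table, the `last_modified` column, and the connection details are assumptions for the example, not requirements from this posting.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

// Illustrative high-water-mark incremental pull: a simple CDC-style
// technique when the source exposes a reliable modification timestamp.
object IncrementalPull {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("cdc-watermark-sketch")
      .enableHiveSupport()
      .getOrCreate()

    // Latest watermark already loaded into the lake
    // (assumes a prior load recorded a row in this placeholder audit table).
    val lastWatermark = spark.table("lake.load_audit")
      .agg(max(col("loaded_through")))
      .collect()(0)
      .getTimestamp(0)

    // Pull only rows changed since the watermark (micro-batch style).
    val changes = spark.read
      .format("jdbc")
      .option("url", "jdbc:mysql://source-host:3306/sales")   // placeholder
      .option("user", sys.env.getOrElse("ETL_USER", "etl"))
      .option("password", sys.env.getOrElse("ETL_PASSWORD", ""))
      .option("dbtable",
        s"(SELECT * FROM orders WHERE last_modified > '$lastWatermark') AS delta")
      .load()

    // Append the delta; a merge/upsert step would replace this for true CDC.
    changes.write.mode("append").format("parquet").saveAsTable("lake.orders_delta")

    spark.stop()
  }
}
```
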
Acosta Sales & Marketing is an Equal Opportunity Employer

**Job ID** _2017-122094_

**Work City** _Jacksonville_

**PCN** _Sourcing Req_

**Work State** _US_ _-_ _FL_ _-_ _Jacksonville_

**Position Type** _Regular Full-Time_

**Work Zip** _32216_

**Starting average hours per week** _37.5 +_
