Monday, September 26, 2016

Director Hadoop Ecosystem AIG Houston

Job Description:
AIG is seeking a Big Data Hadoop - Senior ETL Developer to lead a team in designing, developing, and deploying ETL solutions on a big data platform for the IT Life, Disability and Health pillar of the organization. The candidate will be part of the Information Management organization, which is responsible for architecting and developing a Data Integration Hub that enables AIG to leverage data as an asset by providing a single authoritative view of the business and a layer of separation between our complex data sources and our data consumers, thereby allowing each data layer to evolve independently.
Responsibilities:

Architect, design, construct, test, tune, and deploy ETL infrastructure based on Hadoop ecosystem technologies.

Work closely with administrators, architects, and application teams to ensure applications are performing well and within agreed-upon SLAs.

Work closely with Management and Data Science teams to achieve company business objectives.

Collaborate with other technology teams and architects to define and develop solutions.

Lead/mentor developers in setting ETL architecture, design, and development standards.

Research and experiment with emerging ETL technologies and tools related to Big Data.

Contribute to the Big Data open source ecosystem.

Work with the team to establish and reinforce disciplined software development processes, standards, and error-recovery procedures, ensuring a high degree of data quality.

Maintain, tune, and support the ETL platform on a day-to-day basis to ensure high availability.

This is a hands-on role; you will lead by doing.


Position Requirements:
The ideal candidate will have:

Experience in the Life Insurance industry preferred.

Excellent technical and organizational skills.

Strong communication and leadership skills.

Proficiency with Agile development practices.

Experience with and strong understanding of Data Warehousing and Big Data Hadoop ecosystems.

Experience translating functional and technical requirements into technical specifications and design.

Knowledge of and experience with the full life cycle, from ELT into the data lake to ETL for the data-servicing layer.

Experience with ELT/ETL batch, real-time, streaming, and messaging.

Experience with one or more of the following: Talend, Ab Initio, Informatica/Data Exchange.

Experience with Hadoop MapReduce loading into a Data Warehouse.

Experience with HBase, Cassandra, DynamoDB, CouchDB a plus.

Experience with RDBMS technologies and SQL; Oracle and SQL Server a plus.

Experience with NoSQL platforms a plus.

Experience with cloud solutions, including administration and deployment of data in AWS or Azure, a plus.

Working knowledge of ACORD modeling a plus.

Working knowledge of web technologies and protocols (Java/NoSQL/JSON/REST/JMS) a plus.



Background and experience desired:

The position calls for a seasoned IT professional with senior leadership qualities and a minimum of 10 years' experience in IT.

6+ years of experience building and managing complex ETL infrastructure solutions.

6+ years of experience with distributed, highly scalable, multi-node environments utilizing Big Data ecosystems.

Education:

Bachelor's degree in Information Technology or related field preferred, or equivalent work experience.

10+ years of experience in the IT Industry.