Job Description

The purpose of this job is to research, develop, document, and modify extract, transform, load (ETL) processes and jobs in support of a Big Data infrastructure and the enterprise data warehouse (EDW) for Pilot Flying J. Focus will be on the Enterprise AWS Cloud environment, with an emphasis on EMR and Redshift.

1.      Research, develop, document, and modify Big Data processes and jobs according to data architecture and modeling requirements, and processes set forth by Business Intelligence (BI) Data senior team members

2.      Maintain accountability for completing key EDW/Data Lake project activities; communicate project status and risks to project leader

3.      Troubleshoot issues including connection, failed jobs, application errors, server alerts, and space thresholds within predefined service level agreements (SLAs)

4.      Proactively maintain and tune code according to EDW/Big Data best practices

5.      Review and ensure appropriate documentation for all new development and modifications of the Big Data processes and jobs

6.      Perform unit testing for solutions developed, and ensure integrity and security of institutional data

7.      Under the guidance of lead data team members, complete development according to technical specifications

8.      Model behaviors that support the company’s common purpose; ensure guests and team members are supported at the highest level

9.      Ensure all activities are in compliance with rules, regulations, policies, and procedures of the organization

10.   Complete other duties as assigned


Qualifications


·        High school diploma or equivalent certification required

·        Bachelor’s degree in computer science, engineering, information technology or related field preferred



·        Minimum of 1-3 years of technology operations experience required

Specialized Knowledge

·        Strong SQL knowledge and skills required

·        Knowledge of Postgres and SQL Server preferred

·        Familiarity with AWS or similar data platform required

·        Knowledge of ETL tools such as Informatica or Datastage a plus

·        Familiarity with Enterprise Data Warehouse data models with a focus on Star Schemas preferred

·        Experience with Spark/Spark Streaming, Python, Hive, and Kafka preferred

·        PFJ institutional knowledge preferred





5508 Lonas Rd, Knoxville, TN

Application Instructions

Please click on the link below to apply for this position. A new window will open and direct you to apply at our corporate careers page. We look forward to hearing from you!

Apply Online