Associate D&A Engineer - Big Data Systems for AI 2020 National

Responsibilities & Qualifications
KPMG is currently seeking an Associate Data & Analytics Engineer - Big Data Systems for AI for our Lighthouse practice.

While this requisition may state a specific geographic office, please note that our positions are location flexible across our major hubs. Opportunities may include, but are not limited to, Atlanta, Chicago, Dallas, Denver, New York City, Orange County, Philadelphia, Seattle, and Washington, DC. Please proceed with your application here, and let us know your location preference during the interview phase if applicable.

Responsibilities:
• Rapidly prototype, implement, and optimize architectures to address the Big Data and Data Science needs of a variety of Fortune 1000 corporations and other major organizations that are KPMG clients; develop a modular code base to solve real-world problems while conducting regular peer code reviews to ensure code quality and compliance with industry best practices.
• Work in cross-disciplinary teams with KPMG industry experts to understand client needs and ingest rich data sources such as social media, news, internal and external documents, emails, financial data, and operational data.
• Develop and maintain data science and AI solutions on-premises, in the cloud, on KPMG-hosted infrastructure, or in hybrid environments.
• Help research and experiment with leading and emerging BI/EDW/Big Data methodologies, such as serverless data lakes, AWS Redshift, Athena, Glue, Google BigQuery, and Microsoft Power BI, and apply them to solve real-world client problems. Serve as the team champion for mainstream BI/EDW/Big Data toolsets such as Tableau, Alteryx, Informatica, Pentaho, erwin, and PowerDesigner.
• Help drive the process of pursuing innovations, targeted solutions, and extensible platforms for Lighthouse, KPMG, and clients.
• Participate in developing and presenting points of view on emerging technologies, and help ensure that the Lighthouse technology stack incorporates, and is optimized for, specific technologies.

Qualifications:
• Bachelor's, Master's, or PhD degree from an accredited college or university in Computer Science, Computer Engineering, or a related field. Preferred: relevant work experience in software development with exposure to multiple programming languages and technologies, ideally in a professional services setting.
• Ability to pick up new technologies quickly; familiarity with object-oriented design, coding, and testing patterns, as well as with engineering (commercial or open-source) software platforms and large-scale data infrastructures; working knowledge of RDBMS design, data modeling, and MPP EDW system implementation; hands-on experience with distributed computing architectures and massively parallel processing big data platforms (Hadoop, MapReduce, HDFS, Spark, Hive/Impala, HBase/MongoDB/Cassandra, Teradata/Netezza/Redshift, etc.).
• Hands-on exposure to BI/EDW/Big Data toolsets (Tableau, Alteryx, Informatica, Pentaho, erwin, PowerDesigner); practice with and strong knowledge of mainstream cloud infrastructures (AWS, Microsoft Azure, and GCP, including microservices); ability to implement data lakes and serverless data lakes; fluency in SQL; hands-on exposure to Linux/Unix/Windows/.NET; fluency in several scripting and programming languages (Bash/ksh/PowerShell; Python/Perl/R); understanding of programming practices (version control, testing, QA, CI/CD) and development methodologies (Waterfall and Agile); full-stack development capability is preferred.
• Ability to travel up to 80% of the time, depending on project assignments.
• Targeted graduation date of Fall 2019 through Summer 2020.

Work Authorization
Applicants must be currently authorized to work in the United States without the need for visa sponsorship now or in the future.