Associate Big Data SW Eng - Data Engineering for AI 2020 National

Responsibilities & Qualifications
KPMG is currently seeking an Associate for our Lighthouse - Data & Analytics – Software Engineer – Data Engineering for AI practice.

While this requisition may state a specific geographic office, please note that our positions are location flexible between our major hubs. Opportunities may include, but are not limited to, Atlanta, Chicago, Dallas, Denver, New York City, Orange County, Philadelphia, Seattle, and Washington DC. Please proceed with applying here, and let us know your location preference during the interview phase if applicable.

Responsibilities:
• Under the supervision and mentorship of senior team members, rapidly prototype, develop, and optimize data science implementations to tackle the Big Data and Data Science needs of a variety of Fortune 1000 corporations and other major organizations.
• Using waterfall/agile methodology, develop and maintain data science and AI solutions on on-premises, cloud, KPMG-hosted, or hybrid infrastructure. Follow software engineering guidelines and industry best practices for code quality, conducting regular design and code reviews and building technical documentation.
• Own data within cross-disciplinary teams: discover, profile, acquire, process, and model data for the solutions.
• Implement data processing pipelines, data mining and data science algorithms, and visualization engineering to help clients distill insights from rich data sources (social media, news, internal or external documents, emails, financial data, client data, and operational data).
• Help research and experiment with leading and emerging Big Data methodologies (serverless data lakes, microservices, Hadoop, Spark, Kafka, AWS, MS Azure, GCP) and apply them to real-world client problems.
• From a data engineering point of view, help pursue innovations, target solutions, and extensible platforms for Lighthouse, KPMG, and clients.

Qualifications:
• Bachelor's, Master's, or PhD from an accredited college or university in Computer Science, Computer Engineering, or a related field. Preferred: relevant software development exposure in related industries, ideally in professional services.
• Hands-on knowledge of software engineering: waterfall vs. agile methodologies; object-oriented vs. procedural vs. functional programming; source code version control; continuous integration and continuous delivery/deployment (CI/CD); design patterns; etc. Familiarity with mainstream cloud infrastructures (Amazon Web Services, Azure, Google Cloud Platform) and their AI-related microservices, and with implementing data lakes and serverless data lakes.
• Proficiency in Linux/Unix/Windows/.NET; market-leading fluency in several programming languages preferred (Bash/ksh/PowerShell; Python/Perl/R; Java/C/C++/Scala); expertise in distributed computing architectures and massively parallel processing big data platforms (Hadoop, MapReduce, HDFS, Spark, Hive/Impala, HBase/MongoDB/Cassandra, Teradata/Netezza/Redshift, etc.).
• Ability to travel up to 80% of the time, depending on project assignments.
• Targeted graduation date: Fall 2019 through Summer 2020.

Work Authorization
Applicants must be currently authorized to work in the United States without the need for visa sponsorship now or in the future.