Randstad Sr Software Engineer - Data Platform in Waltham, Massachusetts

Sr Software Engineer - Data Platform

job details:

  • location: Waltham, MA

  • date posted: Friday, October 5, 2018

  • job type: Permanent

  • industry: Professional, Scientific, and Technical Services

  • reference: 649325

job description

Sr Software Engineer - Data Platform

job summary:

Today's enterprises are leveraging data lake architectures to power analytics, business intelligence, and new product features. Our client is using this data lake design to build a Data Platform in the cloud, drawing on the latest AWS services to deliver a cutting-edge, highly scalable, and cost-effective platform.

location: Waltham, Massachusetts

job type: Permanent

work hours: 9 to 5

education: Bachelors

responsibilities:

  • Be a key leader and contributor in the design and development of a scalable, cost-effective, cloud-based data platform built on a data lake design

  • Collaborate with team members, Product Management, Architects, data producers, and data consumers throughout the company

  • Develop data platform components in a cloud environment to ingest data and events from cloud and on-premises environments as well as third parties

  • Build automated pipelines and data services to validate, catalog, aggregate, and transform ingested data

  • Build automated data delivery pipelines and services to integrate data from the data lake to internal and external consuming applications and services

  • Build and deliver cloud-based deployment and monitoring capabilities consistent with DevOps models

  • Keep knowledge and skills current with the latest cloud services, features and best practices
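To give candidates a concrete flavor, the validate/transform responsibility above could look like the following minimal Python sketch. The schema, field names, and sample records are illustrative assumptions for this posting, not the client's actual pipeline:

```python
# Illustrative sketch of a validate/transform stage in an ingest pipeline.
# The required fields and record shape below are assumptions for the example.
from datetime import datetime, timezone

REQUIRED_FIELDS = {"event_id", "user_id", "timestamp"}

def validate(record: dict) -> bool:
    """Accept only records that carry every required field."""
    return REQUIRED_FIELDS.issubset(record)

def transform(record: dict) -> dict:
    """Normalize the epoch timestamp to ISO-8601 UTC and tag ingestion time."""
    ts = datetime.fromtimestamp(record["timestamp"], tz=timezone.utc)
    return {
        **record,
        "timestamp": ts.isoformat(),
        "ingested_at": datetime.now(timezone.utc).isoformat(),
    }

def pipeline(records):
    """Drop invalid records, then transform the survivors."""
    return [transform(r) for r in records if validate(r)]

batch = [
    {"event_id": "e1", "user_id": "u1", "timestamp": 0},
    {"event_id": "e2"},  # missing required fields -> dropped
]
clean = pipeline(batch)
print(len(clean))  # 1
```

In practice a stage like this would typically run behind an AWS service such as Lambda or Glue, with the curated output landing back in S3; the pure-Python core keeps the example self-contained.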

qualifications:

  • Experience developing big data, business intelligence, marketing automation and/or other analytics infrastructure or pipelines - data lake experience preferred

  • Development experience in a cloud environment using Amazon Web Services (AWS certification preferred)

  • Experience with data streaming technologies (Kinesis, Kafka, Spark Streaming) and real-time analytics

  • Working experience and detailed knowledge in Java, Scala or Python

  • Knowledge of ETL, MapReduce, and pipeline tools (Glue, EMR, Spark)

  • Experience with large or partitioned relational databases (Aurora, MySQL, DB2)

  • Experience with NoSQL databases (DynamoDB, Cassandra)

  • Agile development (Scrum) experience

  • Other preferred experience includes working with DevOps practices, SaaS, IaaS, code management (CodeCommit, git), deployment tools (CodeBuild, CodeDeploy, Jenkins, Shell scripting), and Continuous Delivery

  • Primary AWS development skills include S3, IAM, Lambda, RDS, Kinesis, API Gateway, Redshift, EMR, Glue, and CloudFormation

skills: Python, AWS (more back-end focused)

Equal Opportunity Employer: Race, Color, Religion, Sex, Sexual Orientation, Gender Identity, National Origin, Age, Genetic Information, Disability, Protected Veteran Status, or any other legally protected group status.