Posted on: 7 Jan 2023
Java Programming, Oracle, Python Programming, SQL, Apache Kafka, NoSQL, R Programming, MySQL, SPARK Programming, RabbitMQ, AWS Kinesis
Our Enterprise Data Analytics Platform (EDAP) is designed to explore data and extract the insights our customers depend upon. As the primary source for analytics, our customers, both internal and external, rely on us to provide accurate, real-time, and fault-tolerant solutions to their ever-growing data needs. The Enterprise Data Analytics Platform builds highly performant, scalable analytics solutions ranging from data storage systems to computation and serving solutions. We utilize a wide range of open source and industry-accepted technologies for our big data problems, such as Apache Spark, Apache Storm, Amazon Web Services, and Apache Kafka.
We are looking for a skilled Software Engineer with an eye for building and optimizing distributed systems to join our team. From data ingestion, processing, and storage to serving and scale, we work closely with other engineers and product management to build consistent, highly available systems that tackle real-world data and scale problems.
This role is scoped as an individual contributor.
Focus on the development of cloud computing infrastructure, and help build, distribute, scale, and optimize these technologies:
- Designs and implements a cloud-scale, distributed, data-focused analytics platform, services, and frameworks, including solutions that address high-volume and complex data collection, processing, transformation, and reporting for analytical purposes
- Writes code and unit tests, works on API specs, automation, and conducts code reviews and testing
- Develops scalable, robust, and highly available services related to our data platform
- Implements new features and optimizes existing ones to drive maximum performance
- Owns technical aspects of software development and identifies opportunities to adopt innovative technologies.
- Identifies continuous improvements for service availability
- Evaluates and recommends tools, technologies and processes to ensure that the services that the team provides achieve the highest standards of quality and performance
- Debugs and troubleshoots problems in data flow, lineage, transformation and other stages of the ETL pipelines.
- Collaborates with peer organizations (e.g., Business Analysts, Data Modelers, QA, SRE, and technical support) to prevent and resolve technical issues and provide technical guidance
- Works in an Agile development environment: attends daily stand-up meetings, collaborates with peers, prioritizes features, and works with a sense of urgency to deliver value to customers
- Bachelor’s degree
- 5 years of meaningful experience
- A strong level of curiosity paired with the ability to get things done.
- Strong algorithms, data structures, and coding background.
- Experience with standard software engineering practices (e.g., unit testing, code reviews, design documents)
- Experience with Java, Scala, or another programming language
- Strong SQL skills
- Experience building products using at least one of the following distributed technologies:
- Relational stores (e.g., Postgres, MySQL, or Oracle)
- Columnar or NoSQL stores (e.g., Vertica, Redshift, Cassandra, or DynamoDB)
- Distributed processing engines (e.g., Apache Spark, Apache Storm, or Celery)
- Distributed queues (e.g., Apache Kafka, Kinesis, or RabbitMQ)
- Experience with systems that process large amounts of data
- Experience partnering with data scientists, data analysts, and other domain experts to understand their needs and develop solutions
- Experience working with AWS or similar cloud platform technologies
- Authorized to work in the United States with or without sponsorship.
- Bachelor's degree in Computer Science preferred
- 8+ years of meaningful development experience preferred
- Knowledge of open source/industry standard data processing, storage, and serving technologies.
- Ability to take a proactive, problem-solving approach to identifying the root cause of issues and resolving them
- Experience with data warehousing is a plus
- Experience with source control systems, Git preferred
- Must be able to quickly grasp the platform/application's overarching design and ensure development is executed in accordance with that design
- Ability to drive change through persuasion and consensus
- Experience in the financial/insurance industry
The Massachusetts Mutual Life Insurance Company, also known as MassMutual, is a Springfield, Massachusetts-based life insurance company. MassMutual provides financial products such as life insurance, disability income insurance, long-term care insurance, retirement/401(k) plan services, and annuities.
MassMutual is currently hiring for Software Engineer jobs in Boston, MA, USA with an average base salary of $120,000 - $190,000 per year.