Job Description

Responsibilities

  • Develop high-quality data processing infrastructure and scalable services capable of ingesting and transforming data at very large scale, on schedule, from many different sources.
  • Turn ideas and concepts into carefully designed, well-written, high-quality code.
  • Articulate the interdependencies and the impact of the design choices.
  • Develop APIs to power data-driven products, as well as external APIs consumed by internal and external customers of the data platform.
  • Collaborate with QA, product management, engineering, and UX to achieve well-groomed, predictable results.
  • Improve existing engineering processes and tools and develop new ones.
  • Fluidly adapt to changes and new requirements.

Knowledge and Experience

  • 7+ years’ experience working on UI projects using HTML, CSS, and JavaScript
  • 4+ years’ experience working with ReactJS
  • 3+ years’ experience developing UI screens that consume REST APIs, with a good understanding of making API calls and parsing the results they return
  • Experience building REST-based microservices in a distributed architecture with cloud technologies (AWS preferred) is good to have
  • Strong understanding of web programming
  • Good knowledge of Jira, Git, Jenkins
  • Development experience using Agile
  • Hands-on experience improving web page load performance and API call performance
  • Experience in object-oriented design and development with languages such as Java 
  • Knowledge of Java/J2EE frameworks such as Spring Boot, JPA, and JDBC is an added advantage
  • Experience with a variety of data stores for unstructured and columnar data as well as traditional database systems such as MySQL and Postgres
  • Proven ability to deliver working solutions on time
  • Strong analytical thinking to tackle challenging engineering problems.
  • Great energy and enthusiasm, a positive and collaborative working style, and clear communication and writing skills.
  • Experience working in a DevOps environment (“you build it, you run it”)
  • Experience building high-throughput real-time and batch data processing pipelines using Spark and Kafka in an AWS environment, with services such as S3, Kinesis, Lambda, RDS, DynamoDB, or Redshift
  • Experience with big data technologies and exposure to Hadoop, Spark, AWS Glue, AWS EMR, etc.

Schedule

This role offers work-from-home flexibility of up to 2 days per week.

Company Info.

Intercontinental Exchange, Inc.

Intercontinental Exchange, Inc. is an American company, formed in 2000, that operates global financial exchanges and clearing houses and provides mortgage technology, data, and listing services.

  • Industry: Financial services
  • No. of Employees: 8,858
  • Location: Atlanta, GA, USA
