Key Skills
Agile methodologies, Apache Hadoop, AWS, Big Data Technology, Data Modeling, Data science techniques, Database, Elasticsearch, HBase, Java Programming, MapReduce, Python Programming, Scala Programming, Spark Programming
Job Description
This position is part of a growing team building world-class, large-scale Big Data architectures. This individual should have a sound understanding of programming principles, experience programming in Java, Python, or similar languages, and can expect to spend the majority of their time coding.
Responsibilities:
Good development practices
- Hands-on coder with good experience in programming languages such as Java, Python, or Scala.
- Hands-on experience with the Big Data stack: Hadoop, MapReduce, Spark, HBase, and Elasticsearch.
- Good understanding of programming principles and development practices such as check-in policies, unit testing, and code deployment.
- Self-starter able to grasp new concepts and technologies and translate them into large-scale engineering developments.
- Strong experience in application development and support, integration development, and data management.
Align Sigmoid with key Client initiatives
- Interface daily with customers across leading Fortune 500 companies to understand their strategic requirements
Stay up-to-date on the latest technology to ensure the greatest ROI for customers and Sigmoid
- Hands-on coder with a good understanding of enterprise-level code.
- Design and implement APIs, abstractions, and integration patterns to solve challenging distributed computing problems.
- Experience in defining technical requirements, data extraction, data transformation, automating and productionizing jobs, and exploring new big data technologies within a parallel processing environment.
Culture
- Must be a strategic thinker with the ability to think unconventionally, outside the box.
- Analytical and data-driven orientation.
- Raw intellect, talent, and energy are critical.
- Entrepreneurial and agile; understands the demands of a private, high-growth company.
- Ability to be both a leader and a hands-on "doer".
Qualifications:
- A track record of relevant work experience and a degree in Computer Science or a related technical discipline are required.
- Experience with functional and object-oriented programming in Java, Python, or Scala is a must.
- Hands-on experience with the Big Data stack: Hadoop, MapReduce, Spark, HBase, and Elasticsearch.
- Good understanding of AWS services and experience working with APIs and microservices.
- Effective communication skills (both written and verbal)
- Ability to collaborate with a diverse set of engineers, data scientists and product managers
- Comfort in a fast-paced start-up environment
Preferred Qualification:
- Experience with Agile methodologies
- Experience with database modeling and development, data mining, and warehousing.
- Experience in the architecture and delivery of enterprise-scale applications; capable of developing frameworks, design patterns, etc. Should be able to understand and tackle technical challenges, propose comprehensive solutions, and guide junior staff.
- Experience working with large, complex data sets from a variety of sources
Company Info.
Sigmoid
Sigmoid enables business transformation using data and analytics, delivering real-time insights for decision-making by building modern data architectures with cloud and open-source technologies.
Industry
Artificial intelligence
No. of Employees
700
Location
San Francisco, CA, USA
Sigmoid is currently hiring for Python Developer jobs in Hyderabad, Telangana, India, with an average base salary of ₹90,000 - ₹250,000 / month.