AWS, Azure, Big Data Technology, Continuous Integration & Continuous Delivery - CI/CD, Data Architecture, Data Modeling, Data science techniques, Data Visualization, Data Warehousing, ETL frameworks, Google BigQuery, Google Cloud Platform (GCP), JIRA, Machine learning techniques, Python Programming, SQL
About the role:
The Data Corpus team is responsible for building and maintaining SoundCloud’s petabyte-scale GCP data warehouse. You will build and maintain the warehouse, abstracting away the complexity of SoundCloud’s vast data ecosystem so that teams across the company can extract value from its data. You will work closely with business reporting, data science, and product teams. You will gather and refine requirements, design data architecture and solutions, and build ETL pipelines using Airflow to land data in BigQuery and beyond. You have knowledge of, and a passion for, analytics engineering. You are a strong individual contributor who thrives in a dynamic environment and is motivated by solving challenging data problems.
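To give a flavour of the pipeline work described above, here is a minimal, illustrative extract-transform-load sketch. It is not SoundCloud code: in practice this logic would live in an Airflow DAG landing data in BigQuery, but the source, schema, and sink here are stand-ins so the example is self-contained.

```python
# Toy ETL sketch (illustrative only). In the role described above, each
# step would typically be an Airflow task, with load() writing to BigQuery;
# here plain functions and an in-memory sink stand in for those systems.

def extract(raw_rows):
    """Parse raw CSV-like strings into dicts (stand-in for a source read)."""
    return [dict(zip(("track_id", "plays"), row.split(","))) for row in raw_rows]

def transform(rows):
    """Cast types and drop malformed rows (a typical cleaning step)."""
    cleaned = []
    for row in rows:
        try:
            cleaned.append({"track_id": row["track_id"], "plays": int(row["plays"])})
        except (KeyError, ValueError):
            continue  # skip bad records rather than fail the whole load
    return cleaned

def load(rows, sink):
    """Append cleaned rows to a sink (stand-in for a BigQuery table write)."""
    sink.extend(rows)
    return len(rows)

sink = []
loaded = load(transform(extract(["t1,10", "t2,oops", "t3,7"])), sink)
# loaded == 2: the malformed "t2" record is dropped during transform
```

The separation into extract, transform, and load steps mirrors how such pipelines are usually decomposed into independent, retryable tasks in an orchestrator like Airflow.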
About you:
As a Senior individual contributor, you will be able to manage stakeholders, plan and lead technical projects, and identify and focus on the business value of projects.
Must have:
Nice to have:
SoundCloud is an online audio distribution platform and music-sharing website that enables its users to upload, promote, and share audio, and allows listeners to stream it.